LAST SEEN TYPE REASON OBJECT MESSAGE
70m Normal RequirementsUnknown clusterserviceversion/cluster-logging.v6.2.9 requirements not yet checked
70m Normal RequirementsNotMet clusterserviceversion/cluster-logging.v6.2.9 one or more requirements couldn't be found
70m Normal AllRequirementsMet clusterserviceversion/cluster-logging.v6.2.9 all requirements found, attempting install
70m Normal SuccessfulCreate replicaset/cluster-logging-operator-66689c4bbf Created pod: cluster-logging-operator-66689c4bbf-jdjgh
70m Normal Pulling pod/cluster-logging-operator-66689c4bbf-jdjgh Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef"
70m Normal ScalingReplicaSet deployment/cluster-logging-operator Scaled up replica set cluster-logging-operator-66689c4bbf to 1
70m Normal Scheduled pod/cluster-logging-operator-66689c4bbf-jdjgh Successfully assigned openshift-logging/cluster-logging-operator-66689c4bbf-jdjgh to crc
70m Normal AddedInterface pod/cluster-logging-operator-66689c4bbf-jdjgh Add eth0 [10.217.0.49/23] from ovn-kubernetes
70m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.9 waiting for install components to report healthy
70m Normal InstallWaiting clusterserviceversion/cluster-logging.v6.2.9 installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
70m Normal Pulled pod/cluster-logging-operator-66689c4bbf-jdjgh Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef" in 7.015s (7.015s including waiting). Image size: 343887794 bytes.
70m Normal Created pod/cluster-logging-operator-66689c4bbf-jdjgh Created container cluster-logging-operator
70m Normal Started pod/cluster-logging-operator-66689c4bbf-jdjgh Started container cluster-logging-operator
70m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.9 install strategy completed with no errors
70m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-compactor-0 Successfully provisioned volume pvc-6a19384e-dd71-4033-8cc0-90925ec48a23
70m Normal Scheduled pod/logging-loki-querier-6dcbdf8bb8-ljg58 Successfully assigned openshift-logging/logging-loki-querier-6dcbdf8bb8-ljg58 to crc
70m Normal Provisioning persistentvolumeclaim/storage-logging-loki-compactor-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
70m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
70m Normal SuccessfulCreate replicaset/logging-loki-query-frontend-ff66c4dc9 Created pod: logging-loki-query-frontend-ff66c4dc9-z95d6
70m Normal Scheduled pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Successfully assigned openshift-logging/logging-loki-query-frontend-ff66c4dc9-z95d6 to crc
70m Normal ScalingReplicaSet deployment/logging-loki-querier Scaled up replica set logging-loki-querier-6dcbdf8bb8 to 1
70m Normal SuccessfulCreate replicaset/logging-loki-querier-6dcbdf8bb8 Created pod: logging-loki-querier-6dcbdf8bb8-ljg58
70m Normal Provisioning persistentvolumeclaim/wal-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
70m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-index-gateway-0 waiting for first consumer to be created before binding
70m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-ingester-0 waiting for first consumer to be created before binding
70m Normal Provisioning persistentvolumeclaim/storage-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
70m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-ingester-0 Successfully provisioned volume pvc-d0b50390-088a-4531-811f-7dd3ae7c4df3
70m Normal WaitForFirstConsumer persistentvolumeclaim/wal-logging-loki-ingester-0 waiting for first consumer to be created before binding
70m Normal ExternalProvisioning persistentvolumeclaim/wal-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
70m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-compactor-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
70m Normal SuccessfulCreate statefulset/logging-loki-ingester create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
70m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
70m Normal NoPods poddisruptionbudget/logging-loki-ingester No matching pods found
70m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
70m Normal SuccessfulCreate statefulset/logging-loki-compactor create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
70m Normal SuccessfulCreate statefulset/logging-loki-compactor create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
70m Normal Scheduled pod/logging-loki-distributor-9c6b6d984-lmw24 Successfully assigned openshift-logging/logging-loki-distributor-9c6b6d984-lmw24 to crc
70m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
70m Normal ScalingReplicaSet deployment/logging-loki-query-frontend Scaled up replica set logging-loki-query-frontend-ff66c4dc9 to 1
70m Normal Provisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
70m Normal ScalingReplicaSet deployment/logging-loki-distributor Scaled up replica set logging-loki-distributor-9c6b6d984 to 1
70m Normal SuccessfulCreate replicaset/logging-loki-distributor-9c6b6d984 Created pod: logging-loki-distributor-9c6b6d984-lmw24
70m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
70m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
70m Normal ProvisioningSucceeded persistentvolumeclaim/wal-logging-loki-ingester-0 Successfully provisioned volume pvc-2b8fac3b-73e1-476b-b2f7-6cc2585a3d7b
69m Normal Pulling pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
69m Normal NoPods poddisruptionbudget/logging-loki-gateway No matching pods found
69m Normal Scheduled pod/logging-loki-gateway-5bc6c599cb-vz8rf Successfully assigned openshift-logging/logging-loki-gateway-5bc6c599cb-vz8rf to crc
69m Normal Scheduled pod/logging-loki-ingester-0 Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
69m Normal AddedInterface pod/logging-loki-gateway-5bc6c599cb-vz8rf Add eth0 [10.217.0.55/23] from ovn-kubernetes
69m Normal Scheduled pod/logging-loki-gateway-5bc6c599cb-2gcbl Successfully assigned openshift-logging/logging-loki-gateway-5bc6c599cb-2gcbl to crc
69m Normal Pulling pod/logging-loki-gateway-5bc6c599cb-vz8rf Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
69m Normal AddedInterface pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Add eth0 [10.217.0.54/23] from ovn-kubernetes
69m Normal Scheduled pod/logging-loki-index-gateway-0 Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
69m Normal Scheduled pod/logging-loki-compactor-0 Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
69m Normal ScalingReplicaSet deployment/logging-loki-gateway Scaled up replica set logging-loki-gateway-5bc6c599cb to 2
69m Normal SuccessfulCreate replicaset/logging-loki-gateway-5bc6c599cb Created pod: logging-loki-gateway-5bc6c599cb-2gcbl
69m Normal SuccessfulCreate replicaset/logging-loki-gateway-5bc6c599cb Created pod: logging-loki-gateway-5bc6c599cb-vz8rf
69m Normal Pulling pod/logging-loki-querier-6dcbdf8bb8-ljg58 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
69m Normal Pulling pod/logging-loki-distributor-9c6b6d984-lmw24 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
69m Normal AddedInterface pod/logging-loki-distributor-9c6b6d984-lmw24 Add eth0 [10.217.0.52/23] from ovn-kubernetes
69m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-index-gateway-0 Successfully provisioned volume pvc-790d93c2-e6b1-46fa-942b-76c3b0f2e988
69m Normal AddedInterface pod/logging-loki-querier-6dcbdf8bb8-ljg58 Add eth0 [10.217.0.53/23] from ovn-kubernetes
69m Normal Pulling pod/logging-loki-gateway-5bc6c599cb-2gcbl Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
69m Normal AddedInterface pod/logging-loki-compactor-0 Add eth0 [10.217.0.58/23] from ovn-kubernetes
69m Normal Pulling pod/logging-loki-compactor-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
69m Normal Pulling pod/logging-loki-ingester-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
69m Normal Pulling pod/logging-loki-index-gateway-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
69m Normal AddedInterface pod/logging-loki-gateway-5bc6c599cb-2gcbl Add eth0 [10.217.0.56/23] from ovn-kubernetes
69m Normal AddedInterface pod/logging-loki-ingester-0 Add eth0 [10.217.0.57/23] from ovn-kubernetes
69m Normal AddedInterface pod/logging-loki-index-gateway-0 Add eth0 [10.217.0.60/23] from ovn-kubernetes
69m Normal Started pod/logging-loki-distributor-9c6b6d984-lmw24 Started container loki-distributor
69m Normal Pulled pod/logging-loki-index-gateway-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.048s (3.048s including waiting). Image size: 225648085 bytes.
69m Normal Pulled pod/logging-loki-ingester-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.989s (2.989s including waiting). Image size: 225648085 bytes.
69m Normal Pulled pod/logging-loki-distributor-9c6b6d984-lmw24 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 4.032s (4.032s including waiting). Image size: 225648085 bytes.
69m Normal Created pod/logging-loki-distributor-9c6b6d984-lmw24 Created container loki-distributor
69m Normal Pulled pod/logging-loki-gateway-5bc6c599cb-2gcbl Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 3.467s (3.467s including waiting). Image size: 181406084 bytes.
69m Normal Pulled pod/logging-loki-compactor-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.863s (2.863s including waiting). Image size: 225648085 bytes.
69m Normal Pulled pod/logging-loki-querier-6dcbdf8bb8-ljg58 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.654s (3.654s including waiting). Image size: 225648085 bytes.
69m Normal Pulled pod/logging-loki-gateway-5bc6c599cb-vz8rf Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 3.668s (3.668s including waiting). Image size: 181406084 bytes.
69m Normal Pulled pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.776s (3.776s including waiting). Image size: 225648085 bytes.
69m Normal Pulling pod/logging-loki-gateway-5bc6c599cb-2gcbl Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
69m Normal Started pod/logging-loki-gateway-5bc6c599cb-vz8rf Started container gateway
69m Normal Created pod/logging-loki-gateway-5bc6c599cb-vz8rf Created container gateway
69m Normal Pulling pod/logging-loki-gateway-5bc6c599cb-vz8rf Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
69m Normal Started pod/logging-loki-ingester-0 Started container loki-ingester
69m Normal Started pod/logging-loki-index-gateway-0 Started container loki-index-gateway
69m Normal Created pod/logging-loki-index-gateway-0 Created container loki-index-gateway
69m Normal Created pod/logging-loki-ingester-0 Created container loki-ingester
69m Normal Started pod/logging-loki-compactor-0 Started container loki-compactor
69m Normal Created pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Created container loki-query-frontend
69m Normal Created pod/logging-loki-compactor-0 Created container loki-compactor
69m Normal Started pod/logging-loki-querier-6dcbdf8bb8-ljg58 Started container loki-querier
69m Normal Created pod/logging-loki-querier-6dcbdf8bb8-ljg58 Created container loki-querier
69m Normal Started pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Started container loki-query-frontend
69m Normal Created pod/logging-loki-gateway-5bc6c599cb-2gcbl Created container gateway
69m Normal Started pod/logging-loki-gateway-5bc6c599cb-2gcbl Started container gateway
69m Normal Created pod/logging-loki-gateway-5bc6c599cb-2gcbl Created container opa
69m Normal Created pod/logging-loki-gateway-5bc6c599cb-vz8rf Created container opa
69m Normal Started pod/logging-loki-gateway-5bc6c599cb-vz8rf Started container opa
69m Normal Pulled pod/logging-loki-gateway-5bc6c599cb-2gcbl Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.058s (2.058s including waiting). Image size: 161395402 bytes.
69m Normal Pulled pod/logging-loki-gateway-5bc6c599cb-vz8rf Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 1.969s (1.969s including waiting). Image size: 161395402 bytes.
69m Normal Started pod/logging-loki-gateway-5bc6c599cb-2gcbl Started container opa
69m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
69m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
69m Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: HTTP probe failed with statuscode: 503
68m Warning FailedMount pod/collector-fvtb2 MountVolume.SetUp failed for volume "collector-syslog-receiver" : secret "collector-syslog-receiver" not found
68m Normal SuccessfulCreate daemonset/collector Created pod: collector-fvtb2
68m Normal SuccessfulDelete daemonset/collector Deleted pod: collector-fvtb2
68m Warning FailedMount pod/collector-fvtb2 MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
68m Normal Scheduled pod/collector-fvtb2 Successfully assigned openshift-logging/collector-fvtb2 to crc
68m Normal AddedInterface pod/collector-rqcnx Add eth0 [10.217.0.65/23] from ovn-kubernetes
68m Normal Pulling pod/collector-rqcnx Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203"
68m Normal Scheduled pod/collector-rqcnx Successfully assigned openshift-logging/collector-rqcnx to crc
68m Normal SuccessfulCreate daemonset/collector Created pod: collector-rqcnx
68m Normal Pulled pod/collector-rqcnx Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203" in 3.154s (3.154s including waiting). Image size: 263139944 bytes.
68m Normal Started pod/collector-rqcnx Started container collector
68m Normal Created pod/collector-rqcnx Created container collector
62m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
62m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
62m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/address-pool
62m Normal IPAllocated service/openstack-logging Assigned IP ["172.17.0.80"]
62m Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
6m8s Warning Unhealthy pod/logging-loki-distributor-9c6b6d984-lmw24 Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m8s Warning ProbeError pod/logging-loki-distributor-9c6b6d984-lmw24 Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m8s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m8s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe failed: Get "https://10.217.0.56:8083/ready": context deadline exceeded
6m8s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe error: Get "https://10.217.0.56:8083/ready": context deadline exceeded...
5m57s Warning ProbeError pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Liveness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m57s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Liveness probe failed: Get "https://10.217.0.56:8083/live": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
5m57s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Liveness probe error: Get "https://10.217.0.55:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m57s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Liveness probe failed: Get "https://10.217.0.55:8083/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m57s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Liveness probe error: Get "https://10.217.0.55:8083/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m57s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Liveness probe failed: Get "https://10.217.0.55:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m57s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Liveness probe error: Get "https://10.217.0.56:8083/live": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
5m57s Warning Unhealthy pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Liveness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m57s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Liveness probe failed: HTTP probe failed with statuscode: 503
5m57s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Liveness probe error: HTTP probe failed with statuscode: 503...
5m57s Warning ProbeError pod/logging-loki-distributor-9c6b6d984-lmw24 Liveness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m57s Warning Unhealthy pod/logging-loki-distributor-9c6b6d984-lmw24 Liveness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe failed: Get "https://10.217.0.55:8081/ready": context deadline exceeded
5m53s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe error: Get "https://10.217.0.55:8081/ready": context deadline exceeded...
5m53s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe error: Get "https://10.217.0.55:8083/ready": context deadline exceeded...
5m53s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe failed: Get "https://10.217.0.55:8083/ready": context deadline exceeded
5m53s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m48s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe failed: Get "https://10.217.0.56:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
5m48s Warning Unhealthy pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m48s Warning ProbeError pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m48s Warning Unhealthy pod/logging-loki-querier-6dcbdf8bb8-ljg58 Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m48s Warning ProbeError pod/logging-loki-querier-6dcbdf8bb8-ljg58 Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m48s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe error: Get "https://10.217.0.56:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
5m47s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m47s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m47s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m47s Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m47s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m47s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m43s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m43s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m43s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m43s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m43s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe error: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m43s Warning Unhealthy pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe failed: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m38s Warning Unhealthy pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": context deadline exceeded
5m38s Warning ProbeError pod/logging-loki-querier-6dcbdf8bb8-ljg58 Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m38s Warning Unhealthy pod/logging-loki-querier-6dcbdf8bb8-ljg58 Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m38s Warning ProbeError pod/logging-loki-query-frontend-ff66c4dc9-z95d6 Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": context deadline exceeded...
5m38s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-vz8rf Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m38s Warning ProbeError pod/logging-loki-gateway-5bc6c599cb-2gcbl Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m38s Warning Unhealthy pod/logging-loki-distributor-9c6b6d984-lmw24 Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m38s Warning ProbeError pod/logging-loki-distributor-9c6b6d984-lmw24 Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m31s Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/address-pool
5m31s Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
5m31s Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
5m29s Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
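A dump like the one above is easier to triage once the Warning events are tallied by reason. A minimal sketch using only standard tools, assuming the events are fed on stdin (for example, piped from a saved copy of this output or from `oc get events -n openshift-logging`):

```shell
# Tally Warning events by REASON from an `oc get events`-style dump on stdin.
# Extracts each "Warning <Reason>" column pair, then counts occurrences
# per reason, most frequent first.
grep -oE 'Warning [A-Za-z]+' | awk '{print $2}' | sort | uniq -c | sort -rn
```

On this capture, such a tally would surface ProbeError and Unhealthy as the dominant warnings (the Loki readiness/liveness probe timeouts), with FailedMount and deprecatedAnnotation as the remainder.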