LAST SEEN   TYPE      REASON                  OBJECT                                                         MESSAGE
86m         Normal    RequirementsUnknown     clusterserviceversion/cluster-logging.v6.2.9                   requirements not yet checked
86m         Normal    RequirementsNotMet      clusterserviceversion/cluster-logging.v6.2.9                   one or more requirements couldn't be found
86m         Normal    AllRequirementsMet      clusterserviceversion/cluster-logging.v6.2.9                   all requirements found, attempting install
86m         Normal    SuccessfulCreate        replicaset/cluster-logging-operator-66689c4bbf                 Created pod: cluster-logging-operator-66689c4bbf-zx4wt
86m         Normal    Scheduled               pod/cluster-logging-operator-66689c4bbf-zx4wt                  Successfully assigned openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt to crc
86m         Normal    ScalingReplicaSet       deployment/cluster-logging-operator                            Scaled up replica set cluster-logging-operator-66689c4bbf to 1
86m         Normal    Pulling                 pod/cluster-logging-operator-66689c4bbf-zx4wt                  Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef"
86m         Normal    AddedInterface          pod/cluster-logging-operator-66689c4bbf-zx4wt                  Add eth0 [10.217.0.49/23] from ovn-kubernetes
86m         Normal    InstallSucceeded        clusterserviceversion/cluster-logging.v6.2.9                   waiting for install components to report healthy
86m         Normal    InstallWaiting          clusterserviceversion/cluster-logging.v6.2.9                   installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
86m         Normal    Created                 pod/cluster-logging-operator-66689c4bbf-zx4wt                  Created container cluster-logging-operator
86m         Normal    Started                 pod/cluster-logging-operator-66689c4bbf-zx4wt                  Started container cluster-logging-operator
86m         Normal    Pulled                  pod/cluster-logging-operator-66689c4bbf-zx4wt                  Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef" in 7.734s (7.734s including waiting). Image size: 343887794 bytes.
86m         Normal    InstallSucceeded        clusterserviceversion/cluster-logging.v6.2.9                   install strategy completed with no errors
86m         Normal    ExternalProvisioning    persistentvolumeclaim/storage-logging-loki-index-gateway-0     Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m         Normal    SuccessfulCreate        statefulset/logging-loki-ingester                              create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
86m         Normal    ScalingReplicaSet       deployment/logging-loki-query-frontend                         Scaled up replica set logging-loki-query-frontend-ff66c4dc9 to 1
86m         Normal    ScalingReplicaSet       deployment/logging-loki-gateway                                Scaled up replica set logging-loki-gateway-599d7cd94d to 2
86m         Normal    SuccessfulCreate        replicaset/logging-loki-gateway-599d7cd94d                     Created pod: logging-loki-gateway-599d7cd94d-c8sjl
86m         Normal    SuccessfulCreate        replicaset/logging-loki-gateway-599d7cd94d                     Created pod: logging-loki-gateway-599d7cd94d-7f8hf
86m         Normal    SuccessfulCreate        statefulset/logging-loki-index-gateway                         create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
86m         Normal    Scheduled               pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Successfully assigned openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44 to crc
86m         Normal    NoPods                  poddisruptionbudget/logging-loki-querier                       No matching pods found
86m         Normal    ScalingReplicaSet       deployment/logging-loki-querier                                Scaled up replica set logging-loki-querier-6dcbdf8bb8 to 1
86m         Normal    SuccessfulCreate        replicaset/logging-loki-querier-6dcbdf8bb8                     Created pod: logging-loki-querier-6dcbdf8bb8-86w5q
86m         Normal    SuccessfulCreate        statefulset/logging-loki-index-gateway                         create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
86m         Normal    Provisioning            persistentvolumeclaim/storage-logging-loki-compactor-0         External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
86m         Normal    ExternalProvisioning    persistentvolumeclaim/storage-logging-loki-compactor-0         Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m         Normal    ProvisioningSucceeded   persistentvolumeclaim/storage-logging-loki-compactor-0         Successfully provisioned volume pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368
86m         Normal    Provisioning            persistentvolumeclaim/storage-logging-loki-index-gateway-0     External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
86m         Normal    Scheduled               pod/logging-loki-gateway-599d7cd94d-c8sjl                      Successfully assigned openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl to crc
86m         Normal    SuccessfulCreate        statefulset/logging-loki-compactor                             create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
86m         Normal    SuccessfulCreate        statefulset/logging-loki-compactor                             create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
86m         Normal    Scheduled               pod/logging-loki-distributor-9c6b6d984-dfcx2                   Successfully assigned openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2 to crc
86m         Normal    SuccessfulCreate        replicaset/logging-loki-query-frontend-ff66c4dc9               Created pod: logging-loki-query-frontend-ff66c4dc9-gqm44
86m         Normal    ProvisioningSucceeded   persistentvolumeclaim/storage-logging-loki-index-gateway-0     Successfully provisioned volume pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e
86m         Normal    WaitForFirstConsumer    persistentvolumeclaim/storage-logging-loki-ingester-0          waiting for first consumer to be created before binding
86m         Normal    ExternalProvisioning    persistentvolumeclaim/storage-logging-loki-ingester-0          Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m         Normal    SuccessfulCreate        statefulset/logging-loki-ingester                              create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
86m         Normal    ProvisioningSucceeded   persistentvolumeclaim/wal-logging-loki-ingester-0              Successfully provisioned volume pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7
86m         Normal    Provisioning            persistentvolumeclaim/wal-logging-loki-ingester-0              External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
86m         Normal    ExternalProvisioning    persistentvolumeclaim/wal-logging-loki-ingester-0              Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m         Normal    WaitForFirstConsumer    persistentvolumeclaim/wal-logging-loki-ingester-0              waiting for first consumer to be created before binding
86m         Normal    SuccessfulCreate        replicaset/logging-loki-distributor-9c6b6d984                  Created pod: logging-loki-distributor-9c6b6d984-dfcx2
86m         Normal    ScalingReplicaSet       deployment/logging-loki-distributor                            Scaled up replica set logging-loki-distributor-9c6b6d984 to 1
86m         Normal    Scheduled               pod/logging-loki-gateway-599d7cd94d-7f8hf                      Successfully assigned openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf to crc
86m         Normal    Provisioning            persistentvolumeclaim/storage-logging-loki-ingester-0          External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
86m         Normal    ProvisioningSucceeded   persistentvolumeclaim/storage-logging-loki-ingester-0          Successfully provisioned volume pvc-18682c74-0d19-45be-a095-30ed96badb39
86m         Normal    NoPods                  poddisruptionbudget/logging-loki-ingester                      No matching pods found
86m         Normal    SuccessfulCreate        statefulset/logging-loki-ingester                              create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
86m         Normal    Scheduled               pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Successfully assigned openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q to crc
86m         Normal    AddedInterface          pod/logging-loki-gateway-599d7cd94d-c8sjl                      Add eth0 [10.217.0.56/23] from ovn-kubernetes
86m         Normal    Scheduled               pod/logging-loki-index-gateway-0                               Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
86m         Normal    Pulling                 pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
86m         Normal    Pulling                 pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
86m         Normal    Pulling                 pod/logging-loki-gateway-599d7cd94d-7f8hf                      Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
86m         Normal    AddedInterface          pod/logging-loki-gateway-599d7cd94d-7f8hf                      Add eth0 [10.217.0.55/23] from ovn-kubernetes
86m         Normal    AddedInterface          pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Add eth0 [10.217.0.54/23] from ovn-kubernetes
86m         Normal    Scheduled               pod/logging-loki-compactor-0                                   Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
86m         Normal    Scheduled               pod/logging-loki-ingester-0                                    Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
86m         Normal    AddedInterface          pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Add eth0 [10.217.0.53/23] from ovn-kubernetes
86m         Normal    Pulling                 pod/logging-loki-distributor-9c6b6d984-dfcx2                   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
86m         Normal    AddedInterface          pod/logging-loki-distributor-9c6b6d984-dfcx2                   Add eth0 [10.217.0.52/23] from ovn-kubernetes
86m         Normal    Pulling                 pod/logging-loki-gateway-599d7cd94d-c8sjl                      Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
86m         Normal    Pulling                 pod/logging-loki-ingester-0                                    Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
86m         Normal    Pulling                 pod/logging-loki-compactor-0                                   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
86m         Normal    Pulling                 pod/logging-loki-index-gateway-0                               Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
86m         Normal    AddedInterface          pod/logging-loki-index-gateway-0                               Add eth0 [10.217.0.60/23] from ovn-kubernetes
86m         Normal    AddedInterface          pod/logging-loki-compactor-0                                   Add eth0 [10.217.0.58/23] from ovn-kubernetes
86m         Normal    AddedInterface          pod/logging-loki-ingester-0                                    Add eth0 [10.217.0.57/23] from ovn-kubernetes
86m         Normal    Pulled                  pod/logging-loki-ingester-0                                    Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.865s (2.865s including waiting). Image size: 225648085 bytes.
86m         Normal    Pulled                  pod/logging-loki-gateway-599d7cd94d-c8sjl                      Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 3.456s (3.456s including waiting). Image size: 181406084 bytes.
86m         Normal    Pulled                  pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.816s (3.816s including waiting). Image size: 225648085 bytes.
86m         Normal    Pulled                  pod/logging-loki-gateway-599d7cd94d-7f8hf                      Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 3.785s (3.785s including waiting). Image size: 181406084 bytes.
86m         Normal    Pulled                  pod/logging-loki-compactor-0                                   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.563s (2.563s including waiting). Image size: 225648085 bytes.
86m         Normal    Created                 pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Created container loki-querier
86m         Normal    Created                 pod/logging-loki-gateway-599d7cd94d-7f8hf                      Created container gateway
86m         Normal    Created                 pod/logging-loki-ingester-0                                    Created container loki-ingester
86m         Normal    Pulled                  pod/logging-loki-distributor-9c6b6d984-dfcx2                   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.941s (3.941s including waiting). Image size: 225648085 bytes.
86m         Normal    Created                 pod/logging-loki-compactor-0                                   Created container loki-compactor
86m         Normal    Pulled                  pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.944s (3.944s including waiting). Image size: 225648085 bytes.
86m         Normal    Created                 pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Created container loki-query-frontend
86m         Normal    Created                 pod/logging-loki-index-gateway-0                               Created container loki-index-gateway
86m         Normal    Created                 pod/logging-loki-gateway-599d7cd94d-c8sjl                      Created container gateway
86m         Normal    Created                 pod/logging-loki-distributor-9c6b6d984-dfcx2                   Created container loki-distributor
86m         Normal    Pulled                  pod/logging-loki-index-gateway-0                               Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.773s (2.773s including waiting). Image size: 225648085 bytes.
85m         Normal    Started                 pod/logging-loki-index-gateway-0                               Started container loki-index-gateway
85m         Normal    Started                 pod/logging-loki-distributor-9c6b6d984-dfcx2                   Started container loki-distributor
85m         Normal    Started                 pod/logging-loki-compactor-0                                   Started container loki-compactor
85m         Normal    Started                 pod/logging-loki-ingester-0                                    Started container loki-ingester
85m         Normal    Started                 pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Started container loki-querier
85m         Normal    Pulling                 pod/logging-loki-gateway-599d7cd94d-c8sjl                      Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
85m         Normal    Started                 pod/logging-loki-gateway-599d7cd94d-c8sjl                      Started container gateway
85m         Normal    Started                 pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Started container loki-query-frontend
85m         Normal    Pulling                 pod/logging-loki-gateway-599d7cd94d-7f8hf                      Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
85m         Normal    Started                 pod/logging-loki-gateway-599d7cd94d-7f8hf                      Started container gateway
85m         Normal    Pulled                  pod/logging-loki-gateway-599d7cd94d-7f8hf                      Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.342s (2.342s including waiting). Image size: 161395402 bytes.
85m         Normal    Pulled                  pod/logging-loki-gateway-599d7cd94d-c8sjl                      Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.914s (2.914s including waiting). Image size: 161395402 bytes.
85m         Normal    Created                 pod/logging-loki-gateway-599d7cd94d-c8sjl                      Created container opa
85m         Normal    Created                 pod/logging-loki-gateway-599d7cd94d-7f8hf                      Created container opa
85m         Normal    Started                 pod/logging-loki-gateway-599d7cd94d-c8sjl                      Started container opa
85m         Normal    Started                 pod/logging-loki-gateway-599d7cd94d-7f8hf                      Started container opa
85m         Warning   ProbeError              pod/logging-loki-ingester-0                                    Readiness probe error: HTTP probe failed with statuscode: 503...
85m         Warning   Unhealthy               pod/logging-loki-ingester-0                                    Readiness probe failed: HTTP probe failed with statuscode: 503
85m         Warning   ProbeError              pod/logging-loki-ingester-0                                    Readiness probe error: HTTP probe failed with statuscode: 503...
84m         Normal    SuccessfulDelete        daemonset/collector                                            Deleted pod: collector-9vlnr
84m         Warning   FailedMount             pod/collector-9vlnr                                            MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
84m         Normal    Scheduled               pod/collector-9vlnr                                            Successfully assigned openshift-logging/collector-9vlnr to crc
84m         Normal    SuccessfulCreate        daemonset/collector                                            Created pod: collector-9vlnr
84m         Normal    Scheduled               pod/collector-pmw4h                                            Successfully assigned openshift-logging/collector-pmw4h to crc
84m         Normal    SuccessfulCreate        daemonset/collector                                            Created pod: collector-pmw4h
84m         Normal    AddedInterface          pod/collector-pmw4h                                            Add eth0 [10.217.0.65/23] from ovn-kubernetes
84m         Normal    Pulling                 pod/collector-pmw4h                                            Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203"
84m         Normal    Pulled                  pod/collector-pmw4h                                            Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203" in 3.543s (3.543s including waiting). Image size: 263139944 bytes.
84m         Normal    Created                 pod/collector-pmw4h                                            Created container collector
84m         Normal    Started                 pod/collector-pmw4h                                            Started container collector
79m         Warning   deprecatedAnnotation    service/openstack-logging                                      Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
79m         Warning   deprecatedAnnotation    service/openstack-logging                                      Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
79m         Normal    nodeAssigned            service/openstack-logging                                      announcing from node "crc" with protocol "layer2"
79m         Warning   deprecatedAnnotation    service/openstack-logging                                      Service uses deprecated annotation metallb.universe.tf/address-pool
79m         Normal    IPAllocated             service/openstack-logging                                      Assigned IP ["172.17.0.80"]
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-c8sjl                      Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-c8sjl                      Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
20m         Warning   ProbeError              pod/logging-loki-ingester-0                                    Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-ingester-0                                    Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe error: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe failed: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe failed: Get "https://10.217.0.55:8081/ready": context deadline exceeded
20m         Warning   ProbeError              pod/logging-loki-distributor-9c6b6d984-dfcx2                   Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-distributor-9c6b6d984-dfcx2                   Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-c8sjl                      Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-c8sjl                      Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe error: Get "https://10.217.0.55:8081/ready": context deadline exceeded...
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-c8sjl                      Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-gateway-599d7cd94d-7f8hf                      Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-gateway-599d7cd94d-c8sjl                      Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   Unhealthy               pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-distributor-9c6b6d984-dfcx2                   Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-distributor-9c6b6d984-dfcx2                   Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
20m         Warning   Unhealthy               pod/logging-loki-querier-6dcbdf8bb8-86w5q                      Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
20m         Warning   ProbeError              pod/logging-loki-query-frontend-ff66c4dc9-gqm44                Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
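When triaging a dump like this, the Warning events are the ones worth isolating. A minimal sketch, assuming the listing has been saved to a local file (the file name and the sample lines below are hypothetical stand-ins for the full dump, using the same column layout: last-seen, type, reason, object, message):

```shell
# Stand-in sample of saved `oc get events` output (not the full dump above).
cat > events.txt <<'EOF'
86m  Normal   Pulled      pod/collector-pmw4h          Successfully pulled image
20m  Warning  Unhealthy   pod/logging-loki-ingester-0  Readiness probe failed
20m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error
EOF

# Column 2 is TYPE: keep only Warning events and list the distinct objects they touch.
awk '$2 == "Warning" { print $4 }' events.txt | sort -u

# Count how many Warning events there are in total.
awk '$2 == "Warning"' events.txt | wc -l
```

On a live cluster the same filter can be applied server-side instead, e.g. `oc get events -n openshift-logging --field-selector type=Warning`.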