LAST SEEN   TYPE      REASON   OBJECT   MESSAGE
79m   Normal   RequirementsUnknown   clusterserviceversion/cluster-logging.v6.2.8   requirements not yet checked
79m   Normal   RequirementsNotMet    clusterserviceversion/cluster-logging.v6.2.8   one or more requirements couldn't be found
79m   Normal   InstallSucceeded      clusterserviceversion/cluster-logging.v6.2.8   waiting for install components to report healthy
79m   Normal   Scheduled             pod/cluster-logging-operator-c769fd969-rb6br   Successfully assigned openshift-logging/cluster-logging-operator-c769fd969-rb6br to crc
79m   Normal   SuccessfulCreate      replicaset/cluster-logging-operator-c769fd969   Created pod: cluster-logging-operator-c769fd969-rb6br
79m   Normal   ScalingReplicaSet     deployment/cluster-logging-operator   Scaled up replica set cluster-logging-operator-c769fd969 to 1
79m   Normal   AllRequirementsMet    clusterserviceversion/cluster-logging.v6.2.8   all requirements found, attempting install
79m   Normal   AddedInterface        pod/cluster-logging-operator-c769fd969-rb6br   Add eth0 [10.217.0.50/23] from ovn-kubernetes
79m   Normal   Pulling               pod/cluster-logging-operator-c769fd969-rb6br   Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c"
79m   Normal   InstallWaiting        clusterserviceversion/cluster-logging.v6.2.8   installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
79m   Normal   Pulled                pod/cluster-logging-operator-c769fd969-rb6br   Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c" in 7.731s (7.731s including waiting). Image size: 343181526 bytes.
79m   Normal   Started               pod/cluster-logging-operator-c769fd969-rb6br   Started container cluster-logging-operator
79m   Normal   Created               pod/cluster-logging-operator-c769fd969-rb6br   Created container cluster-logging-operator
79m   Normal   InstallSucceeded      clusterserviceversion/cluster-logging.v6.2.8   install strategy completed with no errors
79m   Normal   Scheduled             pod/logging-loki-distributor-5d5548c9f5-lllg8   Successfully assigned openshift-logging/logging-loki-distributor-5d5548c9f5-lllg8 to crc
79m   Normal   SuccessfulCreate      statefulset/logging-loki-ingester   create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
79m   Normal   NoPods                poddisruptionbudget/logging-loki-ingester   No matching pods found
79m   Normal   SuccessfulCreate      statefulset/logging-loki-ingester   create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
79m   Normal   SuccessfulCreate      statefulset/logging-loki-ingester   create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
79m   Normal   WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-ingester-0   waiting for first consumer to be created before binding
79m   Normal   ScalingReplicaSet     deployment/logging-loki-distributor   Scaled up replica set logging-loki-distributor-5d5548c9f5 to 1
79m   Normal   SuccessfulCreate      replicaset/logging-loki-distributor-5d5548c9f5   Created pod: logging-loki-distributor-5d5548c9f5-lllg8
79m   Normal   WaitForFirstConsumer  persistentvolumeclaim/wal-logging-loki-ingester-0   waiting for first consumer to be created before binding
79m   Normal   ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
79m   Normal   Provisioning          persistentvolumeclaim/storage-logging-loki-index-gateway-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
79m   Normal   ScalingReplicaSet     deployment/logging-loki-querier   Scaled up replica set logging-loki-querier-76bf7b6d45 to 1
79m   Normal   Pulling               pod/logging-loki-querier-76bf7b6d45-d8m5w   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
79m   Normal   AddedInterface        pod/logging-loki-querier-76bf7b6d45-d8m5w   Add eth0 [10.217.0.54/23] from ovn-kubernetes
79m   Normal   Scheduled             pod/logging-loki-querier-76bf7b6d45-d8m5w   Successfully assigned openshift-logging/logging-loki-querier-76bf7b6d45-d8m5w to crc
79m   Normal   Scheduled             pod/logging-loki-query-frontend-6d6859c548-nv5fd   Successfully assigned openshift-logging/logging-loki-query-frontend-6d6859c548-nv5fd to crc
79m   Normal   ProvisioningSucceeded persistentvolumeclaim/wal-logging-loki-ingester-0   Successfully provisioned volume pvc-4ed09cde-a991-420a-a4db-81797b30c03d
79m   Normal   ExternalProvisioning  persistentvolumeclaim/wal-logging-loki-ingester-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
79m   Normal   SuccessfulCreate      statefulset/logging-loki-compactor   create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
79m   Normal   SuccessfulCreate      statefulset/logging-loki-compactor   create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
79m   Normal   AddedInterface        pod/logging-loki-query-frontend-6d6859c548-nv5fd   Add eth0 [10.217.0.55/23] from ovn-kubernetes
79m   Normal   AddedInterface        pod/logging-loki-distributor-5d5548c9f5-lllg8   Add eth0 [10.217.0.53/23] from ovn-kubernetes
79m   Normal   Pulling               pod/logging-loki-distributor-5d5548c9f5-lllg8   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
79m   Normal   SuccessfulCreate      statefulset/logging-loki-index-gateway   create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
79m   Normal   NoPods                poddisruptionbudget/logging-loki-index-gateway   No matching pods found
79m   Normal   SuccessfulCreate      statefulset/logging-loki-index-gateway   create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
79m   Normal   Provisioning          persistentvolumeclaim/wal-logging-loki-ingester-0   External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
79m   Normal   Pulling               pod/logging-loki-query-frontend-6d6859c548-nv5fd   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
79m   Normal   ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-ingester-0   Successfully provisioned volume pvc-1b6df600-5e2b-4e1a-b1b6-1bae030c2348
79m   Normal   ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-ingester-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
79m   Normal   ScalingReplicaSet     deployment/logging-loki-gateway   Scaled up replica set logging-loki-gateway-568bb59667 to 2
79m   Normal   SuccessfulCreate      replicaset/logging-loki-gateway-568bb59667   Created pod: logging-loki-gateway-568bb59667-znjxl
79m   Normal   Scheduled             pod/logging-loki-gateway-568bb59667-ctm8g   Successfully assigned openshift-logging/logging-loki-gateway-568bb59667-ctm8g to crc
79m   Warning  FailedMount           pod/logging-loki-gateway-568bb59667-ctm8g   MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
79m   Normal   SuccessfulCreate      replicaset/logging-loki-gateway-568bb59667   Created pod: logging-loki-gateway-568bb59667-ctm8g
79m   Normal   SuccessfulCreate      replicaset/logging-loki-query-frontend-6d6859c548   Created pod: logging-loki-query-frontend-6d6859c548-nv5fd
79m   Normal   ScalingReplicaSet     deployment/logging-loki-query-frontend   Scaled up replica set logging-loki-query-frontend-6d6859c548 to 1
79m   Warning  FailedMount           pod/logging-loki-gateway-568bb59667-znjxl   MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
79m   Normal   Scheduled             pod/logging-loki-gateway-568bb59667-znjxl   Successfully assigned openshift-logging/logging-loki-gateway-568bb59667-znjxl to crc
79m   Normal   WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-compactor-0   waiting for first consumer to be created before binding
79m   Normal   ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-compactor-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
79m   Normal   Provisioning          persistentvolumeclaim/storage-logging-loki-compactor-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
79m   Normal   ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-compactor-0   Successfully provisioned volume pvc-de69c665-b66b-48ba-922b-fa653a86cd6a
79m   Normal   WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-index-gateway-0   waiting for first consumer to be created before binding
79m   Normal   Provisioning          persistentvolumeclaim/storage-logging-loki-ingester-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
79m   Normal   SuccessfulCreate      replicaset/logging-loki-querier-76bf7b6d45   Created pod: logging-loki-querier-76bf7b6d45-d8m5w
79m   Normal   ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-index-gateway-0   Successfully provisioned volume pvc-ffdd6d64-3323-46fd-b882-04be82820142
78m   Normal   Pulling               pod/logging-loki-gateway-568bb59667-ctm8g   Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
78m   Normal   Scheduled             pod/logging-loki-ingester-0   Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
78m   Normal   AddedInterface        pod/logging-loki-index-gateway-0   Add eth0 [10.217.0.61/23] from ovn-kubernetes
78m   Normal   Scheduled             pod/logging-loki-index-gateway-0   Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
78m   Normal   AddedInterface        pod/logging-loki-ingester-0   Add eth0 [10.217.0.58/23] from ovn-kubernetes
78m   Normal   Pulling               pod/logging-loki-ingester-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
78m   Normal   AddedInterface        pod/logging-loki-gateway-568bb59667-znjxl   Add eth0 [10.217.0.57/23] from ovn-kubernetes
78m   Normal   Pulling               pod/logging-loki-gateway-568bb59667-znjxl   Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
78m   Normal   AddedInterface        pod/logging-loki-gateway-568bb59667-ctm8g   Add eth0 [10.217.0.56/23] from ovn-kubernetes
78m   Normal   Scheduled             pod/logging-loki-compactor-0   Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
78m   Normal   Pulling               pod/logging-loki-compactor-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
78m   Normal   AddedInterface        pod/logging-loki-compactor-0   Add eth0 [10.217.0.60/23] from ovn-kubernetes
78m   Normal   Pulling               pod/logging-loki-index-gateway-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
78m   Normal   Pulled                pod/logging-loki-gateway-568bb59667-znjxl   Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 5.413s (5.413s including waiting). Image size: 174532765 bytes.
78m   Normal   Started               pod/logging-loki-query-frontend-6d6859c548-nv5fd   Started container loki-query-frontend
78m   Normal   Created               pod/logging-loki-distributor-5d5548c9f5-lllg8   Created container loki-distributor
78m   Normal   Pulled                pod/logging-loki-querier-76bf7b6d45-d8m5w   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.349s (6.349s including waiting). Image size: 225276683 bytes.
78m   Normal   Started               pod/logging-loki-gateway-568bb59667-znjxl   Started container gateway
78m   Normal   Pulled                pod/logging-loki-compactor-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 4.987s (4.987s including waiting). Image size: 225276683 bytes.
78m   Normal   Created               pod/logging-loki-compactor-0   Created container loki-compactor
78m   Normal   Started               pod/logging-loki-compactor-0   Started container loki-compactor
78m   Normal   Started               pod/logging-loki-ingester-0   Started container loki-ingester
78m   Normal   Created               pod/logging-loki-ingester-0   Created container loki-ingester
78m   Normal   Pulled                pod/logging-loki-ingester-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 5.19s (5.19s including waiting). Image size: 225276683 bytes.
78m   Normal   Pulled                pod/logging-loki-gateway-568bb59667-ctm8g   Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 5.435s (5.435s including waiting). Image size: 174532765 bytes.
78m   Normal   Created               pod/logging-loki-gateway-568bb59667-znjxl   Created container gateway
78m   Normal   Pulling               pod/logging-loki-gateway-568bb59667-znjxl   Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
78m   Normal   Created               pod/logging-loki-query-frontend-6d6859c548-nv5fd   Created container loki-query-frontend
78m   Normal   Pulled                pod/logging-loki-distributor-5d5548c9f5-lllg8   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.173s (6.173s including waiting). Image size: 225276683 bytes.
78m   Normal   Created               pod/logging-loki-gateway-568bb59667-ctm8g   Created container gateway
78m   Normal   Started               pod/logging-loki-gateway-568bb59667-ctm8g   Started container gateway
78m   Normal   Pulling               pod/logging-loki-gateway-568bb59667-ctm8g   Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
78m   Normal   Started               pod/logging-loki-distributor-5d5548c9f5-lllg8   Started container loki-distributor
78m   Normal   Created               pod/logging-loki-querier-76bf7b6d45-d8m5w   Created container loki-querier
78m   Normal   Pulled                pod/logging-loki-index-gateway-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 5.108s (5.108s including waiting). Image size: 225276683 bytes.
78m   Normal   Created               pod/logging-loki-index-gateway-0   Created container loki-index-gateway
78m   Normal   Started               pod/logging-loki-index-gateway-0   Started container loki-index-gateway
78m   Normal   Pulled                pod/logging-loki-query-frontend-6d6859c548-nv5fd   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.12s (6.12s including waiting). Image size: 225276683 bytes.
78m   Normal   Started               pod/logging-loki-querier-76bf7b6d45-d8m5w   Started container loki-querier
78m   Normal   Pulled                pod/logging-loki-gateway-568bb59667-ctm8g   Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 2.409s (2.409s including waiting). Image size: 160762347 bytes.
78m   Normal   Pulled                pod/logging-loki-gateway-568bb59667-znjxl   Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 2.406s (2.406s including waiting). Image size: 160762347 bytes.
78m   Normal   Started               pod/logging-loki-gateway-568bb59667-znjxl   Started container opa
78m   Normal   Created               pod/logging-loki-gateway-568bb59667-znjxl   Created container opa
78m   Normal   Started               pod/logging-loki-gateway-568bb59667-ctm8g   Started container opa
78m   Normal   Created               pod/logging-loki-gateway-568bb59667-ctm8g   Created container opa
78m   Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: HTTP probe failed with statuscode: 503...
78m   Warning  Unhealthy             pod/logging-loki-ingester-0   Readiness probe failed: HTTP probe failed with statuscode: 503
78m   Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: HTTP probe failed with statuscode: 503...
77m   Normal   Scheduled             pod/collector-szh9k   Successfully assigned openshift-logging/collector-szh9k to crc
77m   Normal   SuccessfulCreate      daemonset/collector   Created pod: collector-szh9k
77m   Normal   SuccessfulDelete      daemonset/collector   Deleted pod: collector-szh9k
77m   Normal   SuccessfulCreate      daemonset/collector   Created pod: collector-bdmnm
77m   Normal   Scheduled             pod/collector-bdmnm   Successfully assigned openshift-logging/collector-bdmnm to crc
77m   Normal   Pulling               pod/collector-bdmnm   Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978"
77m   Normal   AddedInterface        pod/collector-bdmnm   Add eth0 [10.217.0.65/23] from ovn-kubernetes
77m   Normal   Started               pod/collector-bdmnm   Started container collector
77m   Normal   Pulled                pod/collector-bdmnm   Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978" in 5.423s (5.423s including waiting). Image size: 320720510 bytes.
77m   Normal   Created               pod/collector-bdmnm   Created container collector
72m   Normal   nodeAssigned          service/openstack-logging   announcing from node "crc" with protocol "layer2"
72m   Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
72m   Normal   IPAllocated           service/openstack-logging   Assigned IP ["172.17.0.80"]
72m   Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/address-pool
72m   Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
66m   Warning  ProbeError            pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe error: ...
7m29s Warning  ProbeError            pod/logging-loki-querier-76bf7b6d45-d8m5w   Readiness probe error: Get "https://10.217.0.54:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
7m29s Warning  Unhealthy             pod/logging-loki-querier-76bf7b6d45-d8m5w   Readiness probe failed: Get "https://10.217.0.54:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
7m27s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-znjxl   Liveness probe failed: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m27s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-ctm8g   Liveness probe failed: Get "https://10.217.0.56:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m27s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-ctm8g   Liveness probe error: Get "https://10.217.0.56:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m27s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-znjxl   Liveness probe error: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m19s Warning  ProbeError            pod/logging-loki-querier-76bf7b6d45-d8m5w   Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m19s Warning  ProbeError            pod/logging-loki-distributor-5d5548c9f5-lllg8   Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m19s Warning  Unhealthy             pod/logging-loki-querier-76bf7b6d45-d8m5w   Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m19s Warning  Unhealthy             pod/logging-loki-distributor-5d5548c9f5-lllg8   Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m19s Warning  Unhealthy             pod/logging-loki-query-frontend-6d6859c548-nv5fd   Readiness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m19s Warning  ProbeError            pod/logging-loki-query-frontend-6d6859c548-nv5fd   Readiness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-znjxl   Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m18s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m18s Warning  ProbeError            pod/logging-loki-index-gateway-0   Readiness probe error: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  Unhealthy             pod/logging-loki-ingester-0   Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m18s Warning  Unhealthy             pod/logging-loki-compactor-0   Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m18s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-znjxl   Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m18s Warning  Unhealthy             pod/logging-loki-index-gateway-0   Readiness probe failed: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m18s Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-znjxl   Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m18s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-znjxl   Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m18s Warning  ProbeError            pod/logging-loki-compactor-0   Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m13s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m13s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m13s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m13s Warning  Unhealthy             pod/logging-loki-gateway-568bb59667-ctm8g   Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m13s Warning  ProbeError            pod/logging-loki-gateway-568bb59667-znjxl   Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m13s Warning ProbeError pod/logging-loki-gateway-568bb59667-znjxl Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)... 7m13s Warning Unhealthy pod/logging-loki-gateway-568bb59667-znjxl Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers) 7m13s Warning Unhealthy pod/logging-loki-gateway-568bb59667-znjxl Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers) 7m9s Warning Unhealthy pod/logging-loki-query-frontend-6d6859c548-nv5fd Readiness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers) 7m9s Warning ProbeError pod/logging-loki-distributor-5d5548c9f5-lllg8 Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)... 7m9s Warning Unhealthy pod/logging-loki-distributor-5d5548c9f5-lllg8 Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers) 7m9s Warning Unhealthy pod/logging-loki-querier-76bf7b6d45-d8m5w Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers) 7m9s Warning ProbeError pod/logging-loki-querier-76bf7b6d45-d8m5w Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)... 7m9s Warning ProbeError pod/logging-loki-query-frontend-6d6859c548-nv5fd Readiness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...