LAST SEEN TYPE REASON OBJECT MESSAGE
78m Normal Scheduled pod/logging-loki-compactor-0 Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
77m Normal Scheduled pod/collector-9d54d Successfully assigned openshift-logging/collector-9d54d to crc
79m Normal Scheduled pod/cluster-logging-operator-79cf69ddc8-mzrfj Successfully assigned openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj to crc
78m Normal Scheduled pod/logging-loki-gateway-66cd7bf4cd-8svgw Successfully assigned openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw to crc
78m Normal Scheduled pod/logging-loki-querier-76788598db-n78fw Successfully assigned openshift-logging/logging-loki-querier-76788598db-n78fw to crc
78m Normal Scheduled pod/logging-loki-gateway-66cd7bf4cd-8vc2p Successfully assigned openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p to crc
78m Normal Scheduled pod/logging-loki-query-frontend-69d9546745-lbmjc Successfully assigned openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc to crc
78m Normal Scheduled pod/logging-loki-ingester-0 Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
77m Normal Scheduled pod/collector-bx9tf Successfully assigned openshift-logging/collector-bx9tf to crc
78m Normal Scheduled pod/logging-loki-index-gateway-0 Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
78m Normal Scheduled pod/logging-loki-distributor-5f678c8dd6-bdcsh Successfully assigned openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh to crc
79m Normal RequirementsUnknown clusterserviceversion/cluster-logging.v6.2.7 requirements not yet checked
79m Normal RequirementsNotMet clusterserviceversion/cluster-logging.v6.2.7 one or more requirements couldn't be found
79m Normal AllRequirementsMet clusterserviceversion/cluster-logging.v6.2.7 all requirements found, attempting install
79m Normal SuccessfulCreate replicaset/cluster-logging-operator-79cf69ddc8 Created pod: cluster-logging-operator-79cf69ddc8-mzrfj
79m Normal InstallWaiting clusterserviceversion/cluster-logging.v6.2.7 installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
79m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.7 waiting for install components to report healthy
79m Normal ScalingReplicaSet deployment/cluster-logging-operator Scaled up replica set cluster-logging-operator-79cf69ddc8 to 1
79m Normal Pulling pod/cluster-logging-operator-79cf69ddc8-mzrfj Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3"
79m Normal AddedInterface pod/cluster-logging-operator-79cf69ddc8-mzrfj Add eth0 [10.217.0.47/23] from ovn-kubernetes
79m Normal Started pod/cluster-logging-operator-79cf69ddc8-mzrfj Started container cluster-logging-operator
79m Normal Created pod/cluster-logging-operator-79cf69ddc8-mzrfj Created container cluster-logging-operator
79m Normal Pulled pod/cluster-logging-operator-79cf69ddc8-mzrfj Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3" in 5.759s (5.759s including waiting). Image size: 343173849 bytes.
79m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.7 install strategy completed with no errors
78m Normal NoPods poddisruptionbudget/logging-loki-ingester No matching pods found
78m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-index-gateway-0 Successfully provisioned volume pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4
78m Normal Provisioning persistentvolumeclaim/storage-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
78m Normal SuccessfulCreate replicaset/logging-loki-gateway-66cd7bf4cd Created pod: logging-loki-gateway-66cd7bf4cd-8svgw
78m Normal SuccessfulCreate replicaset/logging-loki-gateway-66cd7bf4cd Created pod: logging-loki-gateway-66cd7bf4cd-8vc2p
78m Normal NoPods poddisruptionbudget/logging-loki-querier No matching pods found
78m Normal ScalingReplicaSet deployment/logging-loki-querier Scaled up replica set logging-loki-querier-76788598db to 1
78m Normal SuccessfulCreate replicaset/logging-loki-querier-76788598db Created pod: logging-loki-querier-76788598db-n78fw
78m Normal SuccessfulCreate statefulset/logging-loki-compactor create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
78m Normal SuccessfulCreate statefulset/logging-loki-compactor create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
78m Normal ScalingReplicaSet deployment/logging-loki-gateway Scaled up replica set logging-loki-gateway-66cd7bf4cd to 2
78m Normal ScalingReplicaSet deployment/logging-loki-query-frontend Scaled up replica set logging-loki-query-frontend-69d9546745 to 1
78m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
78m Normal SuccessfulCreate replicaset/logging-loki-query-frontend-69d9546745 Created pod: logging-loki-query-frontend-69d9546745-lbmjc
78m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
78m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-compactor-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
78m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-compactor-0 Successfully provisioned volume pvc-3575dd88-8679-404a-b9cc-3c5284583b9a
78m Normal WaitForPodScheduled persistentvolumeclaim/storage-logging-loki-index-gateway-0 waiting for pod logging-loki-index-gateway-0 to be scheduled
78m Normal Provisioning persistentvolumeclaim/storage-logging-loki-compactor-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
78m Normal ProvisioningSucceeded persistentvolumeclaim/wal-logging-loki-ingester-0 Successfully provisioned volume pvc-4457b64d-e326-4f6d-b369-671bb6564cb7
78m Normal Provisioning persistentvolumeclaim/wal-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
78m Normal ExternalProvisioning persistentvolumeclaim/wal-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
78m Normal SuccessfulCreate replicaset/logging-loki-distributor-5f678c8dd6 Created pod: logging-loki-distributor-5f678c8dd6-bdcsh
78m Normal ScalingReplicaSet deployment/logging-loki-distributor Scaled up replica set logging-loki-distributor-5f678c8dd6 to 1
78m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-ingester-0 Successfully provisioned volume pvc-58501ad4-9747-4134-9b58-7da29b69fd5d
78m Normal WaitForFirstConsumer persistentvolumeclaim/wal-logging-loki-ingester-0 waiting for first consumer to be created before binding
78m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
78m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
78m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
78m Normal Provisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
78m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
78m Normal SuccessfulCreate statefulset/logging-loki-ingester create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
78m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-ingester-0 waiting for first consumer to be created before binding
78m Warning FailedMount pod/logging-loki-gateway-66cd7bf4cd-8svgw MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
78m Normal AddedInterface pod/logging-loki-distributor-5f678c8dd6-bdcsh Add eth0 [10.217.0.50/23] from ovn-kubernetes
78m Normal Pulling pod/logging-loki-querier-76788598db-n78fw Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
78m Normal AddedInterface pod/logging-loki-querier-76788598db-n78fw Add eth0 [10.217.0.51/23] from ovn-kubernetes
78m Normal Pulling pod/logging-loki-distributor-5f678c8dd6-bdcsh Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
78m Warning FailedMount pod/logging-loki-gateway-66cd7bf4cd-8vc2p MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
78m Normal AddedInterface pod/logging-loki-query-frontend-69d9546745-lbmjc Add eth0 [10.217.0.52/23] from ovn-kubernetes
78m Normal Pulling pod/logging-loki-query-frontend-69d9546745-lbmjc Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
78m Normal Pulling pod/logging-loki-gateway-66cd7bf4cd-8svgw Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198"
78m Normal Pulling pod/logging-loki-index-gateway-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
78m Normal Pulling pod/logging-loki-ingester-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
78m Normal AddedInterface pod/logging-loki-ingester-0 Add eth0 [10.217.0.55/23] from ovn-kubernetes
78m Normal AddedInterface pod/logging-loki-gateway-66cd7bf4cd-8svgw Add eth0 [10.217.0.53/23] from ovn-kubernetes
78m Normal Pulling pod/logging-loki-gateway-66cd7bf4cd-8vc2p Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198"
78m Normal Pulling pod/logging-loki-compactor-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
78m Normal AddedInterface pod/logging-loki-compactor-0 Add eth0 [10.217.0.56/23] from ovn-kubernetes
78m Normal AddedInterface pod/logging-loki-gateway-66cd7bf4cd-8vc2p Add eth0 [10.217.0.54/23] from ovn-kubernetes
78m Normal AddedInterface pod/logging-loki-index-gateway-0 Add eth0 [10.217.0.57/23] from ovn-kubernetes
78m Normal Pulled pod/logging-loki-distributor-5f678c8dd6-bdcsh Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 4.193s (4.193s including waiting). Image size: 225282830 bytes.
78m Normal Pulled pod/logging-loki-query-frontend-69d9546745-lbmjc Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.919s (3.919s including waiting). Image size: 225282830 bytes.
78m Normal Pulled pod/logging-loki-index-gateway-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 2.982s (2.982s including waiting). Image size: 225282830 bytes.
78m Normal Pulled pod/logging-loki-ingester-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.081s (3.081s including waiting). Image size: 225282830 bytes.
78m Normal Created pod/logging-loki-distributor-5f678c8dd6-bdcsh Created container loki-distributor
78m Normal Pulled pod/logging-loki-querier-76788598db-n78fw Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 4.172s (4.172s including waiting). Image size: 225282830 bytes.
78m Normal Pulled pod/logging-loki-gateway-66cd7bf4cd-8vc2p Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198" in 3.28s (3.28s including waiting). Image size: 180263561 bytes.
78m Normal Started pod/logging-loki-distributor-5f678c8dd6-bdcsh Started container loki-distributor
78m Normal Pulled pod/logging-loki-compactor-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.288s (3.288s including waiting). Image size: 225282830 bytes.
78m Normal Pulled pod/logging-loki-gateway-66cd7bf4cd-8svgw Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198" in 3.547s (3.547s including waiting). Image size: 180263561 bytes.
78m Normal Created pod/logging-loki-compactor-0 Created container loki-compactor
78m Normal Started pod/logging-loki-query-frontend-69d9546745-lbmjc Started container loki-query-frontend
78m Normal Created pod/logging-loki-gateway-66cd7bf4cd-8svgw Created container gateway
78m Normal Started pod/logging-loki-gateway-66cd7bf4cd-8svgw Started container gateway
78m Normal Pulling pod/logging-loki-gateway-66cd7bf4cd-8svgw Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5"
78m Normal Started pod/logging-loki-ingester-0 Started container loki-ingester
78m Normal Started pod/logging-loki-querier-76788598db-n78fw Started container loki-querier
78m Normal Created pod/logging-loki-gateway-66cd7bf4cd-8vc2p Created container gateway
78m Normal Created pod/logging-loki-ingester-0 Created container loki-ingester
78m Normal Created pod/logging-loki-query-frontend-69d9546745-lbmjc Created container loki-query-frontend
78m Normal Pulling pod/logging-loki-gateway-66cd7bf4cd-8vc2p Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5"
78m Normal Started pod/logging-loki-gateway-66cd7bf4cd-8vc2p Started container gateway
78m Normal Created pod/logging-loki-index-gateway-0 Created container loki-index-gateway
78m Normal Started pod/logging-loki-index-gateway-0 Started container loki-index-gateway
78m Normal Created pod/logging-loki-querier-76788598db-n78fw Created container loki-querier
78m Normal Started pod/logging-loki-compactor-0 Started container loki-compactor
78m Normal Started pod/logging-loki-gateway-66cd7bf4cd-8vc2p Started container opa
78m Normal Created pod/logging-loki-gateway-66cd7bf4cd-8svgw Created container opa
78m Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe error: Get "https://10.217.0.53:8083/ready": dial tcp 10.217.0.53:8083: connect: connection refused...
78m Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe failed: Get "https://10.217.0.53:8083/ready": dial tcp 10.217.0.53:8083: connect: connection refused
78m Normal Started pod/logging-loki-gateway-66cd7bf4cd-8svgw Started container opa
78m Normal Pulled pod/logging-loki-gateway-66cd7bf4cd-8vc2p Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5" in 2.582s (2.582s including waiting). Image size: 160768505 bytes.
78m Normal Created pod/logging-loki-gateway-66cd7bf4cd-8vc2p Created container opa
78m Normal Pulled pod/logging-loki-gateway-66cd7bf4cd-8svgw Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5" in 2.548s (2.548s including waiting). Image size: 160768505 bytes.
78m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
77m Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: HTTP probe failed with statuscode: 503
77m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
77m Normal SuccessfulDelete daemonset/collector Deleted pod: collector-9d54d
77m Normal SuccessfulCreate daemonset/collector Created pod: collector-9d54d
77m Normal SuccessfulCreate daemonset/collector Created pod: collector-bx9tf
77m Normal AddedInterface pod/collector-bx9tf Add eth0 [10.217.0.63/23] from ovn-kubernetes
77m Normal Pulling pod/collector-bx9tf Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:fa2cfa2ed336ce105c8dea5bfe0825407e37ef296193ae162f515213fe43c8d5"
77m Normal Pulled pod/collector-bx9tf Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:fa2cfa2ed336ce105c8dea5bfe0825407e37ef296193ae162f515213fe43c8d5" in 6.815s (6.815s including waiting). Image size: 344557702 bytes.
77m Normal Started pod/collector-bx9tf Started container collector
77m Normal Created pod/collector-bx9tf Created container collector
71m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
71m Normal IPAllocated service/openstack-logging Assigned IP ["172.17.0.80"]
71m Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
71m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/address-pool
71m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
20m Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe error: HTTP probe failed with statuscode: 503...
20m Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe failed: HTTP probe failed with statuscode: 503
8m3s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8svgw Liveness probe failed: Get "https://10.217.0.53:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
8m3s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8svgw Liveness probe error: Get "https://10.217.0.53:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m49s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe failed: Get "https://10.217.0.53:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m49s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe error: Get "https://10.217.0.53:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m44s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe failed: Get "https://10.217.0.54:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m44s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe error: Get "https://10.217.0.54:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m43s Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: Get "https://10.217.0.55:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m43s Warning Unhealthy pod/logging-loki-query-frontend-69d9546745-lbmjc Readiness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m43s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: Get "https://10.217.0.56:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m43s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m43s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: Get "https://10.217.0.55:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m43s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: Get "https://10.217.0.56:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m43s Warning ProbeError pod/logging-loki-query-frontend-69d9546745-lbmjc Readiness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m43s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m39s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe failed: Get "https://10.217.0.53:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m39s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe error: Get "https://10.217.0.53:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m39s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe error: Get "https://10.217.0.54:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m39s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe error: Get "https://10.217.0.54:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m39s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe failed: Get "https://10.217.0.54:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m39s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe failed: Get "https://10.217.0.54:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m35s Warning Unhealthy pod/logging-loki-distributor-5f678c8dd6-bdcsh Readiness probe failed: Get "https://10.217.0.50:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m35s Warning ProbeError pod/logging-loki-distributor-5f678c8dd6-bdcsh Readiness probe error: Get "https://10.217.0.50:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m34s Warning ProbeError pod/logging-loki-distributor-5f678c8dd6-bdcsh Liveness probe error: Get "https://10.217.0.50:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m34s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe failed: Get "https://10.217.0.53:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m34s Warning Unhealthy pod/logging-loki-query-frontend-69d9546745-lbmjc Readiness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
7m34s Warning ProbeError pod/logging-loki-query-frontend-69d9546745-lbmjc Readiness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
7m34s Warning Unhealthy pod/logging-loki-distributor-5f678c8dd6-bdcsh Liveness probe failed: Get "https://10.217.0.50:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m34s Warning Unhealthy pod/logging-loki-querier-76788598db-n78fw Readiness probe failed: Get "https://10.217.0.51:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m34s Warning ProbeError pod/logging-loki-querier-76788598db-n78fw Readiness probe error: Get "https://10.217.0.51:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m33s Warning ProbeError pod/logging-loki-query-frontend-69d9546745-lbmjc Liveness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m33s Warning Unhealthy pod/logging-loki-querier-76788598db-n78fw Liveness probe failed: Get "https://10.217.0.51:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m33s Warning ProbeError pod/logging-loki-querier-76788598db-n78fw Liveness probe error: Get "https://10.217.0.51:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m33s Warning Unhealthy pod/logging-loki-query-frontend-69d9546745-lbmjc Liveness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m25s Warning ProbeError pod/logging-loki-distributor-5f678c8dd6-bdcsh Readiness probe error: Get "https://10.217.0.50:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m25s Warning Unhealthy pod/logging-loki-distributor-5f678c8dd6-bdcsh Readiness probe failed: Get "https://10.217.0.50:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m24s Warning Unhealthy pod/logging-loki-querier-76788598db-n78fw Readiness probe failed: Get "https://10.217.0.51:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m24s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe error: Get "https://10.217.0.54:8081/ready": context deadline exceeded...
7m24s Warning Unhealthy pod/logging-loki-gateway-66cd7bf4cd-8vc2p Readiness probe failed: Get "https://10.217.0.54:8081/ready": context deadline exceeded
7m24s Warning ProbeError pod/logging-loki-gateway-66cd7bf4cd-8svgw Readiness probe error: Get "https://10.217.0.53:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m24s Warning ProbeError pod/logging-loki-querier-76788598db-n78fw Readiness probe error: Get "https://10.217.0.51:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m24s Warning ProbeError pod/logging-loki-query-frontend-69d9546745-lbmjc Readiness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m24s Warning Unhealthy pod/logging-loki-query-frontend-69d9546745-lbmjc Readiness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
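The trailing Warning events show readiness and liveness probes timing out across the LokiStack pods roughly 70 minutes after an otherwise clean install. As a minimal follow-up sketch, assuming this dump was captured with the oc CLI against the openshift-logging namespace (the command that produced it is not shown above), the warnings can be isolated and re-sorted by time, and one of the affected pods inspected directly; the pod and container names are taken from the events above and only standard oc/kubectl flags are used:

  oc get events -n openshift-logging --field-selector type=Warning --sort-by='.lastTimestamp'
  oc describe pod logging-loki-gateway-66cd7bf4cd-8svgw -n openshift-logging
  oc logs logging-loki-gateway-66cd7bf4cd-8svgw -c gateway -n openshift-logging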