LAST SEEN  TYPE     REASON  OBJECT  MESSAGE
94m  Normal  RequirementsUnknown  clusterserviceversion/cluster-logging.v6.2.8  requirements not yet checked
94m  Normal  RequirementsNotMet  clusterserviceversion/cluster-logging.v6.2.8  one or more requirements couldn't be found
94m  Normal  AllRequirementsMet  clusterserviceversion/cluster-logging.v6.2.8  all requirements found, attempting install
94m  Normal  SuccessfulCreate  replicaset/cluster-logging-operator-c769fd969  Created pod: cluster-logging-operator-c769fd969-vqmnv
94m  Normal  Scheduled  pod/cluster-logging-operator-c769fd969-vqmnv  Successfully assigned openshift-logging/cluster-logging-operator-c769fd969-vqmnv to crc
94m  Normal  ScalingReplicaSet  deployment/cluster-logging-operator  Scaled up replica set cluster-logging-operator-c769fd969 to 1
94m  Normal  Pulling  pod/cluster-logging-operator-c769fd969-vqmnv  Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c"
94m  Normal  AddedInterface  pod/cluster-logging-operator-c769fd969-vqmnv  Add eth0 [10.217.0.50/23] from ovn-kubernetes
94m  Normal  InstallSucceeded  clusterserviceversion/cluster-logging.v6.2.8  waiting for install components to report healthy
94m  Normal  InstallWaiting  clusterserviceversion/cluster-logging.v6.2.8  installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
94m  Normal  Pulled  pod/cluster-logging-operator-c769fd969-vqmnv  Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c" in 11.515s (11.515s including waiting). Image size: 343181526 bytes.
94m  Normal  Created  pod/cluster-logging-operator-c769fd969-vqmnv  Created container cluster-logging-operator
94m  Normal  Started  pod/cluster-logging-operator-c769fd969-vqmnv  Started container cluster-logging-operator
93m  Normal  InstallSucceeded  clusterserviceversion/cluster-logging.v6.2.8  install strategy completed with no errors
93m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
93m  Normal  Provisioning  persistentvolumeclaim/wal-logging-loki-ingester-0  External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
93m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-compactor-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
93m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-compactor-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
93m  Normal  ScalingReplicaSet  deployment/logging-loki-query-frontend  Scaled up replica set logging-loki-query-frontend-6d6859c548 to 1
93m  Normal  SuccessfulCreate  replicaset/logging-loki-query-frontend-6d6859c548  Created pod: logging-loki-query-frontend-6d6859c548-wjt7d
93m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-index-gateway-0  Successfully provisioned volume pvc-1c77188b-c680-4368-aae1-fd7aa00f07c4
93m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-ingester-0  waiting for first consumer to be created before binding
93m  Normal  ExternalProvisioning  persistentvolumeclaim/wal-logging-loki-ingester-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
93m  Normal  Scheduled  pod/logging-loki-gateway-7bbb966984-qjtwt  Successfully assigned openshift-logging/logging-loki-gateway-7bbb966984-qjtwt to crc
93m  Normal  Scheduled  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Successfully assigned openshift-logging/logging-loki-query-frontend-6d6859c548-wjt7d to crc
93m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
93m  Normal  ScalingReplicaSet  deployment/logging-loki-querier  Scaled up replica set logging-loki-querier-76bf7b6d45 to 1
93m  Normal  SuccessfulCreate  replicaset/logging-loki-querier-76bf7b6d45  Created pod: logging-loki-querier-76bf7b6d45-kvhl6
93m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-ingester-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
93m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-ingester-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
93m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
93m  Normal  WaitForFirstConsumer  persistentvolumeclaim/wal-logging-loki-ingester-0  waiting for first consumer to be created before binding
93m  Normal  Scheduled  pod/logging-loki-querier-76bf7b6d45-kvhl6  Successfully assigned openshift-logging/logging-loki-querier-76bf7b6d45-kvhl6 to crc
93m  Normal  ScalingReplicaSet  deployment/logging-loki-gateway  Scaled up replica set logging-loki-gateway-7bbb966984 to 2
93m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
93m  Normal  SuccessfulCreate  statefulset/logging-loki-compactor  create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
93m  Normal  SuccessfulCreate  statefulset/logging-loki-compactor  create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
93m  Normal  Scheduled  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Successfully assigned openshift-logging/logging-loki-distributor-5d5548c9f5-fvl8r to crc
93m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-ingester-0  Successfully provisioned volume pvc-bdb35603-8302-4e1d-8ed3-c1881e0afb8c
93m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-compactor-0  Successfully provisioned volume pvc-fc93a735-0cf8-47eb-9bf2-a3246c8a254a
93m  Normal  NoPods  poddisruptionbudget/logging-loki-ingester  No matching pods found
93m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
93m  Warning  FailedCreate  replicaset/logging-loki-gateway-7bbb966984  Error creating: pods "logging-loki-gateway-7bbb966984-" is forbidden: error looking up service account openshift-logging/logging-loki-gateway: serviceaccount "logging-loki-gateway" not found
93m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-index-gateway-0  waiting for first consumer to be created before binding
93m  Normal  ProvisioningSucceeded  persistentvolumeclaim/wal-logging-loki-ingester-0  Successfully provisioned volume pvc-ee69c683-d7dc-4728-9744-3ca8faba19ba
93m  Normal  SuccessfulCreate  statefulset/logging-loki-index-gateway  create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
93m  Normal  SuccessfulCreate  statefulset/logging-loki-index-gateway  create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
93m  Normal  SuccessfulCreate  replicaset/logging-loki-distributor-5d5548c9f5  Created pod: logging-loki-distributor-5d5548c9f5-fvl8r
93m  Normal  ScalingReplicaSet  deployment/logging-loki-distributor  Scaled up replica set logging-loki-distributor-5d5548c9f5 to 1
93m  Normal  Scheduled  pod/logging-loki-gateway-7bbb966984-jqlhm  Successfully assigned openshift-logging/logging-loki-gateway-7bbb966984-jqlhm to crc
93m  Normal  SuccessfulCreate  replicaset/logging-loki-gateway-7bbb966984  Created pod: logging-loki-gateway-7bbb966984-jqlhm
93m  Normal  SuccessfulCreate  replicaset/logging-loki-gateway-7bbb966984  Created pod: logging-loki-gateway-7bbb966984-qjtwt
93m  Normal  Pulling  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
93m  Normal  Scheduled  pod/logging-loki-index-gateway-0  Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
93m  Normal  AddedInterface  pod/logging-loki-gateway-7bbb966984-jqlhm  Add eth0 [10.217.0.56/23] from ovn-kubernetes
93m  Normal  Scheduled  pod/logging-loki-ingester-0  Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
93m  Normal  AddedInterface  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Add eth0 [10.217.0.53/23] from ovn-kubernetes
93m  Normal  AddedInterface  pod/logging-loki-querier-76bf7b6d45-kvhl6  Add eth0 [10.217.0.54/23] from ovn-kubernetes
93m  Normal  Pulling  pod/logging-loki-querier-76bf7b6d45-kvhl6  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
93m  Normal  Scheduled  pod/logging-loki-compactor-0  Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
93m  Normal  Pulling  pod/logging-loki-gateway-7bbb966984-qjtwt  Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
93m  Normal  AddedInterface  pod/logging-loki-gateway-7bbb966984-qjtwt  Add eth0 [10.217.0.57/23] from ovn-kubernetes
93m  Normal  AddedInterface  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Add eth0 [10.217.0.55/23] from ovn-kubernetes
93m  Normal  Pulling  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
93m  Normal  AddedInterface  pod/logging-loki-ingester-0  Add eth0 [10.217.0.58/23] from ovn-kubernetes
93m  Normal  Pulling  pod/logging-loki-gateway-7bbb966984-jqlhm  Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
93m  Normal  AddedInterface  pod/logging-loki-compactor-0  Add eth0 [10.217.0.60/23] from ovn-kubernetes
93m  Normal  Pulling  pod/logging-loki-compactor-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
93m  Normal  Pulling  pod/logging-loki-ingester-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
93m  Normal  Pulling  pod/logging-loki-index-gateway-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
93m  Normal  AddedInterface  pod/logging-loki-index-gateway-0  Add eth0 [10.217.0.61/23] from ovn-kubernetes
93m  Normal  Pulled  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 8.234s (8.234s including waiting). Image size: 225276683 bytes.
93m  Normal  Pulled  pod/logging-loki-compactor-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 7.357s (7.357s including waiting). Image size: 225276683 bytes.
93m  Normal  Pulled  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 8.306s (8.306s including waiting). Image size: 225276683 bytes.
93m  Normal  Pulled  pod/logging-loki-ingester-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 7.449s (7.449s including waiting). Image size: 225276683 bytes.
93m  Normal  Created  pod/logging-loki-ingester-0  Created container loki-ingester
93m  Normal  Created  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Created container loki-query-frontend
93m  Normal  Started  pod/logging-loki-ingester-0  Started container loki-ingester
93m  Normal  Created  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Created container loki-distributor
93m  Normal  Started  pod/logging-loki-compactor-0  Started container loki-compactor
93m  Normal  Started  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Started container loki-distributor
93m  Normal  Started  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Started container loki-query-frontend
93m  Normal  Created  pod/logging-loki-compactor-0  Created container loki-compactor
93m  Normal  Pulled  pod/logging-loki-querier-76bf7b6d45-kvhl6  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 9.663s (9.663s including waiting). Image size: 225276683 bytes.
93m  Normal  Created  pod/logging-loki-querier-76bf7b6d45-kvhl6  Created container loki-querier
93m  Normal  Started  pod/logging-loki-index-gateway-0  Started container loki-index-gateway
93m  Normal  Created  pod/logging-loki-index-gateway-0  Created container loki-index-gateway
93m  Normal  Pulled  pod/logging-loki-index-gateway-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 8.62s (8.62s including waiting). Image size: 225276683 bytes.
93m  Normal  Started  pod/logging-loki-querier-76bf7b6d45-kvhl6  Started container loki-querier
93m  Normal  Started  pod/logging-loki-gateway-7bbb966984-jqlhm  Started container gateway
93m  Normal  Pulled  pod/logging-loki-gateway-7bbb966984-qjtwt  Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 11.426s (11.426s including waiting). Image size: 174532765 bytes.
93m  Normal  Pulling  pod/logging-loki-gateway-7bbb966984-qjtwt  Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
93m  Normal  Created  pod/logging-loki-gateway-7bbb966984-qjtwt  Created container gateway
93m  Normal  Pulling  pod/logging-loki-gateway-7bbb966984-jqlhm  Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
93m  Normal  Started  pod/logging-loki-gateway-7bbb966984-qjtwt  Started container gateway
93m  Normal  Created  pod/logging-loki-gateway-7bbb966984-jqlhm  Created container gateway
93m  Normal  Pulled  pod/logging-loki-gateway-7bbb966984-jqlhm  Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 11.343s (11.343s including waiting). Image size: 174532765 bytes.
93m  Normal  Pulled  pod/logging-loki-gateway-7bbb966984-qjtwt  Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 5.497s (5.497s including waiting). Image size: 160762347 bytes.
93m  Normal  Started  pod/logging-loki-gateway-7bbb966984-qjtwt  Started container opa
93m  Normal  Created  pod/logging-loki-gateway-7bbb966984-qjtwt  Created container opa
93m  Normal  Pulled  pod/logging-loki-gateway-7bbb966984-jqlhm  Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 5.547s (5.547s including waiting). Image size: 160762347 bytes.
93m  Normal  Started  pod/logging-loki-gateway-7bbb966984-jqlhm  Started container opa
93m  Normal  Created  pod/logging-loki-gateway-7bbb966984-jqlhm  Created container opa
93m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: HTTP probe failed with statuscode: 503...
92m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: HTTP probe failed with statuscode: 503...
92m  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: HTTP probe failed with statuscode: 503
92m  Normal  SuccessfulCreate  daemonset/collector  Created pod: collector-tpf9l
92m  Normal  Scheduled  pod/collector-tpf9l  Successfully assigned openshift-logging/collector-tpf9l to crc
92m  Warning  FailedMount  pod/collector-tpf9l  MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
92m  Normal  SuccessfulDelete  daemonset/collector  Deleted pod: collector-tpf9l
92m  Normal  SuccessfulCreate  daemonset/collector  Created pod: collector-xvznb
92m  Normal  Scheduled  pod/collector-xvznb  Successfully assigned openshift-logging/collector-xvznb to crc
92m  Normal  Pulling  pod/collector-xvznb  Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978"
92m  Normal  AddedInterface  pod/collector-xvznb  Add eth0 [10.217.0.66/23] from ovn-kubernetes
92m  Normal  Started  pod/collector-xvznb  Started container collector
92m  Normal  Created  pod/collector-xvznb  Created container collector
92m  Normal  Pulled  pod/collector-xvznb  Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978" in 8.831s (8.831s including waiting). Image size: 320720510 bytes.
85m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/address-pool
85m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
85m  Normal  IPAllocated  service/openstack-logging  Assigned IP ["172.17.0.80"]
85m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
85m  Normal  nodeAssigned  service/openstack-logging  announcing from node "crc" with protocol "layer2"
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-querier-76bf7b6d45-kvhl6  Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Readiness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Readiness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-querier-76bf7b6d45-kvhl6  Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Liveness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-compactor-0  Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Liveness probe error: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-querier-76bf7b6d45-kvhl6  Liveness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": context deadline exceeded...
21m  Warning  Unhealthy  pod/logging-loki-querier-76bf7b6d45-kvhl6  Liveness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": context deadline exceeded
21m  Warning  Unhealthy  pod/logging-loki-index-gateway-0  Readiness probe failed: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-index-gateway-0  Readiness probe error: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Liveness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Liveness probe failed: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-compactor-0  Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-jqlhm  Liveness probe error: Get "https://10.217.0.56:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-index-gateway-0  Liveness probe error: Get "https://10.217.0.61:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-qjtwt  Liveness probe error: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-index-gateway-0  Liveness probe failed: Get "https://10.217.0.61:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-compactor-0  Liveness probe error: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-compactor-0  Liveness probe failed: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-qjtwt  Liveness probe error: Get "https://10.217.0.57:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-qjtwt  Liveness probe failed: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-jqlhm  Liveness probe failed: Get "https://10.217.0.56:8083/live": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-qjtwt  Liveness probe failed: Get "https://10.217.0.57:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-ingester-0  Liveness probe error: Get "https://10.217.0.58:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-ingester-0  Liveness probe failed: Get "https://10.217.0.58:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-jqlhm  Liveness probe failed: Get "https://10.217.0.56:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-jqlhm  Liveness probe error: Get "https://10.217.0.56:8083/live": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Readiness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-query-frontend-6d6859c548-wjt7d  Readiness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-querier-76bf7b6d45-kvhl6  Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-querier-76bf7b6d45-kvhl6  Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-distributor-5d5548c9f5-fvl8r  Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-qjtwt  Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
21m  Warning  Unhealthy  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
21m  Warning  ProbeError  pod/logging-loki-gateway-7bbb966984-jqlhm  Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
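A dump like the one above mixes routine lifecycle events with the warnings that actually matter for triage (FailedMount, FailedCreate, ProbeError, Unhealthy). One quick way to sift it is to group Warning events by reason. The sketch below is a hypothetical helper, not part of any tooling shown here; it assumes whitespace-separated `oc get events`-style lines with one event per line, as in the output above:

```python
from collections import Counter

def warning_reasons(events_text):
    """Count Warning events by reason in `oc get events`-style output.

    Assumes one event per line in the form:
    LAST-SEEN  TYPE  REASON  OBJECT  MESSAGE...
    """
    counts = Counter()
    for line in events_text.splitlines():
        parts = line.split(None, 4)  # at most 5 fields; MESSAGE keeps its spaces
        if len(parts) >= 3 and parts[1] == "Warning":
            counts[parts[2]] += 1
    return counts

# Hypothetical sample drawn from the kind of events shown above.
sample = """\
92m Warning FailedMount pod/collector-tpf9l MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
93m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
92m Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: HTTP probe failed with statuscode: 503
93m Normal Pulled pod/logging-loki-ingester-0 Successfully pulled image
"""
print(warning_reasons(sample))
```

Run against the full dump, this makes the pattern obvious: a handful of startup-ordering warnings around install time, then a burst of ProbeError/Unhealthy timeouts about 21 minutes ago across every Loki pod at once, which points at a node-wide stall rather than a single unhealthy component.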