LAST SEEN  TYPE  REASON  OBJECT  MESSAGE
92m  Normal  RequirementsUnknown  clusterserviceversion/cluster-logging.v6.2.8  requirements not yet checked
92m  Normal  RequirementsNotMet  clusterserviceversion/cluster-logging.v6.2.8  one or more requirements couldn't be found
92m  Normal  SuccessfulCreate  replicaset/cluster-logging-operator-c769fd969  Created pod: cluster-logging-operator-c769fd969-k82gj
92m  Normal  Scheduled  pod/cluster-logging-operator-c769fd969-k82gj  Successfully assigned openshift-logging/cluster-logging-operator-c769fd969-k82gj to crc
92m  Normal  ScalingReplicaSet  deployment/cluster-logging-operator  Scaled up replica set cluster-logging-operator-c769fd969 to 1
92m  Normal  AllRequirementsMet  clusterserviceversion/cluster-logging.v6.2.8  all requirements found, attempting install
92m  Normal  InstallSucceeded  clusterserviceversion/cluster-logging.v6.2.8  waiting for install components to report healthy
92m  Normal  InstallWaiting  clusterserviceversion/cluster-logging.v6.2.8  installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
92m  Normal  Pulling  pod/cluster-logging-operator-c769fd969-k82gj  Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c"
92m  Normal  AddedInterface  pod/cluster-logging-operator-c769fd969-k82gj  Add eth0 [10.217.0.51/23] from ovn-kubernetes
92m  Normal  Pulled  pod/cluster-logging-operator-c769fd969-k82gj  Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c" in 8.201s (8.201s including waiting). Image size: 343181526 bytes.
92m  Normal  Created  pod/cluster-logging-operator-c769fd969-k82gj  Created container cluster-logging-operator
92m  Normal  Started  pod/cluster-logging-operator-c769fd969-k82gj  Started container cluster-logging-operator
92m  Normal  InstallSucceeded  clusterserviceversion/cluster-logging.v6.2.8  install strategy completed with no errors
92m  Normal  ScalingReplicaSet  deployment/logging-loki-gateway  Scaled up replica set logging-loki-gateway-58595d78f8 to 2
92m  Normal  ExternalProvisioning  persistentvolumeclaim/wal-logging-loki-ingester-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
92m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
92m  Normal  Scheduled  pod/logging-loki-querier-76bf7b6d45-nsgkb  Successfully assigned openshift-logging/logging-loki-querier-76bf7b6d45-nsgkb to crc
92m  Normal  SuccessfulCreate  statefulset/logging-loki-index-gateway  create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
92m  Normal  SuccessfulCreate  statefulset/logging-loki-index-gateway  create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
92m  Normal  SuccessfulCreate  replicaset/logging-loki-querier-76bf7b6d45  Created pod: logging-loki-querier-76bf7b6d45-nsgkb
92m  Normal  NoPods  poddisruptionbudget/logging-loki-querier  No matching pods found
92m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-ingester-0  waiting for first consumer to be created before binding
92m  Normal  ScalingReplicaSet  deployment/logging-loki-querier  Scaled up replica set logging-loki-querier-76bf7b6d45 to 1
92m  Normal  Scheduled  pod/logging-loki-query-frontend-6d6859c548-phxp4  Successfully assigned openshift-logging/logging-loki-query-frontend-6d6859c548-phxp4 to crc
92m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
92m  Normal  SuccessfulCreate  replicaset/logging-loki-query-frontend-6d6859c548  Created pod: logging-loki-query-frontend-6d6859c548-phxp4
92m  Normal  ScalingReplicaSet  deployment/logging-loki-query-frontend  Scaled up replica set logging-loki-query-frontend-6d6859c548 to 1
92m  Normal  NoPods  poddisruptionbudget/logging-loki-query-frontend  No matching pods found
92m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-compactor-0  waiting for first consumer to be created before binding
92m  Normal  ProvisioningSucceeded  persistentvolumeclaim/wal-logging-loki-ingester-0  Successfully provisioned volume pvc-f5d10b38-9144-4c01-8031-65d46511a3d6
92m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
92m  Normal  Provisioning  persistentvolumeclaim/wal-logging-loki-ingester-0  External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
92m  Normal  WaitForFirstConsumer  persistentvolumeclaim/wal-logging-loki-ingester-0  waiting for first consumer to be created before binding
92m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-ingester-0  Successfully provisioned volume pvc-51bb0221-acb5-43f3-beab-ba1daf8b47d8
92m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-ingester-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
92m  Normal  SuccessfulCreate  statefulset/logging-loki-compactor  create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
92m  Normal  SuccessfulCreate  statefulset/logging-loki-compactor  create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
92m  Normal  Scheduled  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Successfully assigned openshift-logging/logging-loki-distributor-5d5548c9f5-8fxrr to crc
92m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-compactor-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
92m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-compactor-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
92m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-compactor-0  Successfully provisioned volume pvc-fe0033d2-2dc8-4f66-ad82-b009031f585a
92m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-index-gateway-0  waiting for first consumer to be created before binding
92m  Normal  NoPods  poddisruptionbudget/logging-loki-ingester  No matching pods found
92m  Normal  ScalingReplicaSet  deployment/logging-loki-distributor  Scaled up replica set logging-loki-distributor-5d5548c9f5 to 1
92m  Normal  SuccessfulCreate  replicaset/logging-loki-distributor-5d5548c9f5  Created pod: logging-loki-distributor-5d5548c9f5-8fxrr
92m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-ingester-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
92m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
92m  Normal  Pulling  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
92m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
92m  Normal  AddedInterface  pod/logging-loki-querier-76bf7b6d45-nsgkb  Add eth0 [10.217.0.55/23] from ovn-kubernetes
92m  Normal  Pulling  pod/logging-loki-query-frontend-6d6859c548-phxp4  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
92m  Normal  Scheduled  pod/logging-loki-gateway-58595d78f8-lmbn4  Successfully assigned openshift-logging/logging-loki-gateway-58595d78f8-lmbn4 to crc
92m  Normal  Pulling  pod/logging-loki-gateway-58595d78f8-vq8xm  Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
92m  Normal  AddedInterface  pod/logging-loki-gateway-58595d78f8-vq8xm  Add eth0 [10.217.0.57/23] from ovn-kubernetes
92m  Normal  Scheduled  pod/logging-loki-gateway-58595d78f8-vq8xm  Successfully assigned openshift-logging/logging-loki-gateway-58595d78f8-vq8xm to crc
92m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-index-gateway-0  Successfully provisioned volume pvc-d085bf17-b5d3-4ea6-8d5e-ab52b6050983
92m  Normal  Pulling  pod/logging-loki-querier-76bf7b6d45-nsgkb  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
92m  Normal  AddedInterface  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Add eth0 [10.217.0.54/23] from ovn-kubernetes
92m  Normal  SuccessfulCreate  replicaset/logging-loki-gateway-58595d78f8  Created pod: logging-loki-gateway-58595d78f8-lmbn4
92m  Normal  Scheduled  pod/logging-loki-index-gateway-0  Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
92m  Normal  SuccessfulCreate  replicaset/logging-loki-gateway-58595d78f8  Created pod: logging-loki-gateway-58595d78f8-vq8xm
92m  Normal  Scheduled  pod/logging-loki-ingester-0  Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
92m  Normal  AddedInterface  pod/logging-loki-query-frontend-6d6859c548-phxp4  Add eth0 [10.217.0.56/23] from ovn-kubernetes
92m  Normal  Scheduled  pod/logging-loki-compactor-0  Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
92m  Normal  Pulling  pod/logging-loki-gateway-58595d78f8-lmbn4  Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
92m  Normal  Pulling  pod/logging-loki-compactor-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
92m  Normal  Pulling  pod/logging-loki-ingester-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
92m  Normal  AddedInterface  pod/logging-loki-ingester-0  Add eth0 [10.217.0.60/23] from ovn-kubernetes
92m  Normal  Pulling  pod/logging-loki-index-gateway-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
92m  Normal  AddedInterface  pod/logging-loki-compactor-0  Add eth0 [10.217.0.61/23] from ovn-kubernetes
92m  Normal  AddedInterface  pod/logging-loki-index-gateway-0  Add eth0 [10.217.0.62/23] from ovn-kubernetes
92m  Normal  AddedInterface  pod/logging-loki-gateway-58595d78f8-lmbn4  Add eth0 [10.217.0.58/23] from ovn-kubernetes
92m  Normal  Pulled  pod/logging-loki-query-frontend-6d6859c548-phxp4  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 3.017s (3.017s including waiting). Image size: 225276683 bytes.
92m  Normal  Pulled  pod/logging-loki-gateway-58595d78f8-vq8xm  Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 3.01s (3.01s including waiting). Image size: 174532765 bytes.
92m  Normal  Pulled  pod/logging-loki-ingester-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 2.014s (2.014s including waiting). Image size: 225276683 bytes.
92m  Normal  Pulled  pod/logging-loki-compactor-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 2.184s (2.184s including waiting). Image size: 225276683 bytes.
92m  Normal  Pulled  pod/logging-loki-querier-76bf7b6d45-nsgkb  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 3.143s (3.143s including waiting). Image size: 225276683 bytes.
92m  Normal  Pulled  pod/logging-loki-gateway-58595d78f8-lmbn4  Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 2.75s (2.75s including waiting). Image size: 174532765 bytes.
92m  Normal  Created  pod/logging-loki-gateway-58595d78f8-vq8xm  Created container gateway
92m  Normal  Pulling  pod/logging-loki-gateway-58595d78f8-lmbn4  Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
92m  Normal  Started  pod/logging-loki-gateway-58595d78f8-lmbn4  Started container gateway
92m  Normal  Created  pod/logging-loki-gateway-58595d78f8-lmbn4  Created container gateway
92m  Normal  Started  pod/logging-loki-gateway-58595d78f8-vq8xm  Started container gateway
92m  Normal  Pulling  pod/logging-loki-gateway-58595d78f8-vq8xm  Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
92m  Normal  Started  pod/logging-loki-ingester-0  Started container loki-ingester
92m  Normal  Started  pod/logging-loki-compactor-0  Started container loki-compactor
92m  Normal  Created  pod/logging-loki-query-frontend-6d6859c548-phxp4  Created container loki-query-frontend
92m  Normal  Created  pod/logging-loki-compactor-0  Created container loki-compactor
92m  Normal  Created  pod/logging-loki-ingester-0  Created container loki-ingester
92m  Normal  Started  pod/logging-loki-querier-76bf7b6d45-nsgkb  Started container loki-querier
92m  Normal  Created  pod/logging-loki-querier-76bf7b6d45-nsgkb  Created container loki-querier
92m  Normal  Started  pod/logging-loki-query-frontend-6d6859c548-phxp4  Started container loki-query-frontend
92m  Normal  Created  pod/logging-loki-gateway-58595d78f8-lmbn4  Created container opa
92m  Normal  Started  pod/logging-loki-gateway-58595d78f8-vq8xm  Started container opa
92m  Normal  Created  pod/logging-loki-gateway-58595d78f8-vq8xm  Created container opa
92m  Normal  Pulled  pod/logging-loki-gateway-58595d78f8-vq8xm  Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 2.009s (2.009s including waiting). Image size: 160762347 bytes.
92m  Normal  Started  pod/logging-loki-gateway-58595d78f8-lmbn4  Started container opa
92m  Normal  Pulled  pod/logging-loki-gateway-58595d78f8-lmbn4  Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 2.016s (2.016s including waiting). Image size: 160762347 bytes.
92m  Normal  Pulled  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 7.483s (7.483s including waiting). Image size: 225276683 bytes.
92m  Normal  Created  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Created container loki-distributor
92m  Normal  Started  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Started container loki-distributor
92m  Normal  Started  pod/logging-loki-index-gateway-0  Started container loki-index-gateway
92m  Normal  Pulled  pod/logging-loki-index-gateway-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 7.863s (7.863s including waiting). Image size: 225276683 bytes.
92m  Normal  Created  pod/logging-loki-index-gateway-0  Created container loki-index-gateway
91m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: HTTP probe failed with statuscode: 503...
91m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: HTTP probe failed with statuscode: 503...
91m  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: HTTP probe failed with statuscode: 503
90m  Normal  Scheduled  pod/collector-4jtbw  Successfully assigned openshift-logging/collector-4jtbw to crc
90m  Normal  SuccessfulCreate  daemonset/collector  Created pod: collector-4jtbw
90m  Normal  SuccessfulDelete  daemonset/collector  Deleted pod: collector-4jtbw
90m  Normal  SuccessfulCreate  daemonset/collector  Created pod: collector-x5zk2
90m  Normal  Scheduled  pod/collector-x5zk2  Successfully assigned openshift-logging/collector-x5zk2 to crc
90m  Normal  AddedInterface  pod/collector-x5zk2  Add eth0 [10.217.0.64/23] from ovn-kubernetes
90m  Normal  Pulling  pod/collector-x5zk2  Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978"
90m  Normal  Started  pod/collector-x5zk2  Started container collector
90m  Normal  Created  pod/collector-x5zk2  Created container collector
90m  Normal  Pulled  pod/collector-x5zk2  Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978" in 6.61s (6.61s including waiting). Image size: 320720510 bytes.
85m  Normal  IPAllocated  service/openstack-logging  Assigned IP ["172.17.0.80"]
85m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/address-pool
85m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
85m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
85m  Normal  nodeAssigned  service/openstack-logging  announcing from node "crc" with protocol "layer2"
6m36s  Warning  ProbeError  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Readiness probe error: Get "https://10.217.0.54:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m36s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe error: Get "https://10.217.0.58:8081/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m36s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe failed: Get "https://10.217.0.58:8081/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m36s  Warning  Unhealthy  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Readiness probe failed: Get "https://10.217.0.54:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m31s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe failed: Get "https://10.217.0.58:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m21s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe error: Get "https://10.217.0.58:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m21s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe failed: Get "https://10.217.0.58:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m16s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe error: Get "https://10.217.0.57:8083/ready": context deadline exceeded...
6m16s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe error: Get "https://10.217.0.58:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m16s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe failed: Get "https://10.217.0.58:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m16s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe error: Get "https://10.217.0.58:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m16s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe failed: Get "https://10.217.0.58:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m16s  Warning  Unhealthy  pod/logging-loki-query-frontend-6d6859c548-phxp4  Readiness probe failed: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m16s  Warning  ProbeError  pod/logging-loki-query-frontend-6d6859c548-phxp4  Readiness probe error: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m16s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m16s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m16s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe failed: Get "https://10.217.0.57:8083/ready": context deadline exceeded
6m16s  Warning  Unhealthy  pod/logging-loki-querier-76bf7b6d45-nsgkb  Readiness probe failed: Get "https://10.217.0.55:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m16s  Warning  ProbeError  pod/logging-loki-querier-76bf7b6d45-nsgkb  Readiness probe error: Get "https://10.217.0.55:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-query-frontend-6d6859c548-phxp4  Liveness probe error: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Liveness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-index-gateway-0  Readiness probe error: Get "https://10.217.0.62:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-querier-76bf7b6d45-nsgkb  Liveness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  Unhealthy  pod/logging-loki-querier-76bf7b6d45-nsgkb  Liveness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  Unhealthy  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  Unhealthy  pod/logging-loki-query-frontend-6d6859c548-phxp4  Liveness probe failed: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  Unhealthy  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Liveness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  ProbeError  pod/logging-loki-distributor-5d5548c9f5-8fxrr  Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-vq8xm  Liveness probe failed: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-vq8xm  Liveness probe error: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  Unhealthy  pod/logging-loki-index-gateway-0  Readiness probe failed: Get "https://10.217.0.62:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  Unhealthy  pod/logging-loki-compactor-0  Readiness probe failed: Get "https://10.217.0.61:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: Get "https://10.217.0.60:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  ProbeError  pod/logging-loki-query-frontend-6d6859c548-phxp4  Readiness probe error: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  Unhealthy  pod/logging-loki-query-frontend-6d6859c548-phxp4  Readiness probe failed: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  ProbeError  pod/logging-loki-compactor-0  Readiness probe error: Get "https://10.217.0.61:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: Get "https://10.217.0.60:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Liveness probe error: Get "https://10.217.0.58:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  ProbeError  pod/logging-loki-querier-76bf7b6d45-nsgkb  Readiness probe error: Get "https://10.217.0.55:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m15s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Liveness probe failed: Get "https://10.217.0.58:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s  Warning  Unhealthy  pod/logging-loki-querier-76bf7b6d45-nsgkb  Readiness probe failed: Get "https://10.217.0.55:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  ProbeError  pod/logging-loki-compactor-0  Readiness probe error: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  Unhealthy  pod/logging-loki-ingester-0  Liveness probe failed: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Liveness probe error: Get "https://10.217.0.58:8081/live": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  ProbeError  pod/logging-loki-ingester-0  Liveness probe error: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  Unhealthy  pod/logging-loki-compactor-0  Readiness probe failed: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  Unhealthy  pod/logging-loki-index-gateway-0  Liveness probe failed: Get "https://10.217.0.62:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Liveness probe failed: Get "https://10.217.0.58:8081/live": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-vq8xm  Liveness probe error: Get "https://10.217.0.57:8081/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  ProbeError  pod/logging-loki-index-gateway-0  Liveness probe error: Get "https://10.217.0.62:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  ProbeError  pod/logging-loki-compactor-0  Liveness probe error: Get "https://10.217.0.61:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  Unhealthy  pod/logging-loki-compactor-0  Liveness probe failed: Get "https://10.217.0.61:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-vq8xm  Liveness probe failed: Get "https://10.217.0.57:8081/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s  Warning  ProbeError  pod/logging-loki-index-gateway-0  Readiness probe error: Get "https://10.217.0.62:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s  Warning  Unhealthy  pod/logging-loki-index-gateway-0  Readiness probe failed: Get "https://10.217.0.62:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m11s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m11s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe error: Get "https://10.217.0.58:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m11s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe failed: Get "https://10.217.0.58:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m11s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-lmbn4  Readiness probe error: Get "https://10.217.0.58:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m11s  Warning  ProbeError  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m11s  Warning  Unhealthy  pod/logging-loki-gateway-58595d78f8-vq8xm  Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m1s  Normal  nodeAssigned  service/openstack-logging  announcing from node "crc" with protocol "layer2"