LAST SEEN TYPE REASON OBJECT MESSAGE
72m Normal RequirementsUnknown clusterserviceversion/cluster-logging.v6.2.8 requirements not yet checked
72m Normal RequirementsNotMet clusterserviceversion/cluster-logging.v6.2.8 one or more requirements couldn't be found
72m Normal AllRequirementsMet clusterserviceversion/cluster-logging.v6.2.8 all requirements found, attempting install
72m Normal SuccessfulCreate replicaset/cluster-logging-operator-c769fd969 Created pod: cluster-logging-operator-c769fd969-kl2hc
72m Normal Scheduled pod/cluster-logging-operator-c769fd969-kl2hc Successfully assigned openshift-logging/cluster-logging-operator-c769fd969-kl2hc to crc
72m Normal ScalingReplicaSet deployment/cluster-logging-operator Scaled up replica set cluster-logging-operator-c769fd969 to 1
72m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.8 waiting for install components to report healthy
72m Normal InstallWaiting clusterserviceversion/cluster-logging.v6.2.8 installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
72m Normal Pulling pod/cluster-logging-operator-c769fd969-kl2hc Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c"
72m Normal AddedInterface pod/cluster-logging-operator-c769fd969-kl2hc Add eth0 [10.217.0.49/23] from ovn-kubernetes
72m Normal Created pod/cluster-logging-operator-c769fd969-kl2hc Created container cluster-logging-operator
72m Normal Started pod/cluster-logging-operator-c769fd969-kl2hc Started container cluster-logging-operator
72m Normal Pulled pod/cluster-logging-operator-c769fd969-kl2hc Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:3303e932fe310496b6ae4a8d40a4d6cea0fd93cb4059506d88e774c124cf1b3c" in 11.386s (11.386s including waiting). Image size: 343181526 bytes.
72m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.8 install strategy completed with no errors
72m Normal Provisioning persistentvolumeclaim/storage-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
72m Normal SuccessfulCreate replicaset/logging-loki-distributor-5d5548c9f5 Created pod: logging-loki-distributor-5d5548c9f5-2wbxs
72m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-ingester-0 waiting for first consumer to be created before binding
72m Normal Scheduled pod/logging-loki-query-frontend-6d6859c548-x4qsb Successfully assigned openshift-logging/logging-loki-query-frontend-6d6859c548-x4qsb to crc
72m Normal ScalingReplicaSet deployment/logging-loki-querier Scaled up replica set logging-loki-querier-76bf7b6d45 to 1
72m Normal SuccessfulCreate replicaset/logging-loki-querier-76bf7b6d45 Created pod: logging-loki-querier-76bf7b6d45-t4b27
72m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-index-gateway-0 Successfully provisioned volume pvc-0f32b534-5951-4c3e-b9b2-026a0e305786
72m Normal Scheduled pod/logging-loki-gateway-7b58bd6fcd-n8wjk Successfully assigned openshift-logging/logging-loki-gateway-7b58bd6fcd-n8wjk to crc
72m Normal Scheduled pod/logging-loki-querier-76bf7b6d45-t4b27 Successfully assigned openshift-logging/logging-loki-querier-76bf7b6d45-t4b27 to crc
72m Normal SuccessfulCreate statefulset/logging-loki-ingester create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
72m Normal NoPods poddisruptionbudget/logging-loki-ingester No matching pods found
72m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
72m Normal ScalingReplicaSet deployment/logging-loki-query-frontend Scaled up replica set logging-loki-query-frontend-6d6859c548 to 1
72m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
72m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-compactor-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
72m Normal Provisioning persistentvolumeclaim/storage-logging-loki-compactor-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
72m Normal SuccessfulCreate replicaset/logging-loki-query-frontend-6d6859c548 Created pod: logging-loki-query-frontend-6d6859c548-x4qsb
72m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
72m Normal ProvisioningSucceeded persistentvolumeclaim/wal-logging-loki-ingester-0 Successfully provisioned volume pvc-1d2aca6e-9120-4b67-be38-44ca2808106e
72m Normal Provisioning persistentvolumeclaim/wal-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
72m Normal ExternalProvisioning persistentvolumeclaim/wal-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
72m Normal WaitForFirstConsumer persistentvolumeclaim/wal-logging-loki-ingester-0 waiting for first consumer to be created before binding
72m Normal SuccessfulCreate statefulset/logging-loki-compactor create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
72m Normal SuccessfulCreate statefulset/logging-loki-compactor create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
72m Normal Scheduled pod/logging-loki-distributor-5d5548c9f5-2wbxs Successfully assigned openshift-logging/logging-loki-distributor-5d5548c9f5-2wbxs to crc
72m Normal Provisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
72m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
72m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
72m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-compactor-0 Successfully provisioned volume pvc-016eb6bd-b854-4870-a438-a4e9e43742bc
72m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-index-gateway-0 waiting for first consumer to be created before binding
72m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-ingester-0 Successfully provisioned volume pvc-32e8d43a-b032-4d7d-9038-e03bff6a896d
72m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
72m Normal SuccessfulCreate replicaset/logging-loki-gateway-7b58bd6fcd Created pod: logging-loki-gateway-7b58bd6fcd-n8wjk
72m Normal ScalingReplicaSet deployment/logging-loki-gateway Scaled up replica set logging-loki-gateway-7b58bd6fcd to 2
72m Normal SuccessfulCreate replicaset/logging-loki-gateway-7b58bd6fcd Created pod: logging-loki-gateway-7b58bd6fcd-58vxq
72m Normal ScalingReplicaSet deployment/logging-loki-distributor Scaled up replica set logging-loki-distributor-5d5548c9f5 to 1
72m Normal Scheduled pod/logging-loki-gateway-7b58bd6fcd-58vxq Successfully assigned openshift-logging/logging-loki-gateway-7b58bd6fcd-58vxq to crc
71m Warning FailedMount pod/logging-loki-gateway-7b58bd6fcd-58vxq MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
71m Normal Pulling pod/logging-loki-query-frontend-6d6859c548-x4qsb Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
71m Normal Scheduled pod/logging-loki-index-gateway-0 Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
71m Normal AddedInterface pod/logging-loki-distributor-5d5548c9f5-2wbxs Add eth0 [10.217.0.52/23] from ovn-kubernetes
71m Normal Scheduled pod/logging-loki-ingester-0 Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
71m Normal Scheduled pod/logging-loki-compactor-0 Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
71m Warning FailedMount pod/logging-loki-gateway-7b58bd6fcd-n8wjk MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
71m Normal AddedInterface pod/logging-loki-querier-76bf7b6d45-t4b27 Add eth0 [10.217.0.53/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-querier-76bf7b6d45-t4b27 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
71m Normal AddedInterface pod/logging-loki-query-frontend-6d6859c548-x4qsb Add eth0 [10.217.0.54/23] from ovn-kubernetes
71m Normal AddedInterface pod/logging-loki-ingester-0 Add eth0 [10.217.0.57/23] from ovn-kubernetes
71m Normal AddedInterface pod/logging-loki-index-gateway-0 Add eth0 [10.217.0.60/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-ingester-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
71m Normal Pulling pod/logging-loki-distributor-5d5548c9f5-2wbxs Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
71m Normal Pulling pod/logging-loki-gateway-7b58bd6fcd-n8wjk Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
71m Normal AddedInterface pod/logging-loki-gateway-7b58bd6fcd-n8wjk Add eth0 [10.217.0.55/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-compactor-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
71m Normal AddedInterface pod/logging-loki-compactor-0 Add eth0 [10.217.0.58/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-gateway-7b58bd6fcd-58vxq Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed"
71m Normal AddedInterface pod/logging-loki-gateway-7b58bd6fcd-58vxq Add eth0 [10.217.0.56/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-index-gateway-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d"
71m Normal Pulled pod/logging-loki-compactor-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.086s (6.086s including waiting). Image size: 225276683 bytes.
71m Normal Pulled pod/logging-loki-index-gateway-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.007s (6.007s including waiting). Image size: 225276683 bytes.
71m Normal Pulled pod/logging-loki-query-frontend-6d6859c548-x4qsb Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 7.292s (7.292s including waiting). Image size: 225276683 bytes.
71m Normal Pulled pod/logging-loki-querier-76bf7b6d45-t4b27 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 7.226s (7.226s including waiting). Image size: 225276683 bytes.
71m Normal Pulled pod/logging-loki-distributor-5d5548c9f5-2wbxs Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.834s (6.834s including waiting). Image size: 225276683 bytes.
71m Normal Pulled pod/logging-loki-gateway-7b58bd6fcd-n8wjk Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 6.417s (6.418s including waiting). Image size: 174532765 bytes.
71m Normal Pulled pod/logging-loki-gateway-7b58bd6fcd-58vxq Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:0460d39f2624e731821734a679b0235240dc8e107a1986acc8197ab377f628ed" in 6.199s (6.199s including waiting). Image size: 174532765 bytes.
71m Normal Started pod/logging-loki-querier-76bf7b6d45-t4b27 Started container loki-querier
71m Normal Created pod/logging-loki-distributor-5d5548c9f5-2wbxs Created container loki-distributor
71m Normal Started pod/logging-loki-ingester-0 Started container loki-ingester
71m Normal Created pod/logging-loki-ingester-0 Created container loki-ingester
71m Normal Pulled pod/logging-loki-ingester-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:c9e7d4b4842e9a0a7f74b2a010c3cfdb433fa00b73ca7a1b86a56c935cdb633d" in 6.023s (6.023s including waiting). Image size: 225276683 bytes.
71m Normal Created pod/logging-loki-compactor-0 Created container loki-compactor
71m Normal Started pod/logging-loki-compactor-0 Started container loki-compactor
71m Normal Pulling pod/logging-loki-gateway-7b58bd6fcd-n8wjk Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
71m Normal Started pod/logging-loki-distributor-5d5548c9f5-2wbxs Started container loki-distributor
71m Normal Started pod/logging-loki-query-frontend-6d6859c548-x4qsb Started container loki-query-frontend
71m Normal Created pod/logging-loki-query-frontend-6d6859c548-x4qsb Created container loki-query-frontend
71m Normal Started pod/logging-loki-gateway-7b58bd6fcd-58vxq Started container gateway
71m Normal Created pod/logging-loki-gateway-7b58bd6fcd-58vxq Created container gateway
71m Normal Started pod/logging-loki-index-gateway-0 Started container loki-index-gateway
71m Normal Created pod/logging-loki-querier-76bf7b6d45-t4b27 Created container loki-querier
71m Normal Started pod/logging-loki-gateway-7b58bd6fcd-n8wjk Started container gateway
71m Normal Created pod/logging-loki-index-gateway-0 Created container loki-index-gateway
71m Normal Created pod/logging-loki-gateway-7b58bd6fcd-n8wjk Created container gateway
71m Normal Pulling pod/logging-loki-gateway-7b58bd6fcd-58vxq Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d"
71m Normal Pulled pod/logging-loki-gateway-7b58bd6fcd-n8wjk Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 2.064s (2.064s including waiting). Image size: 160762347 bytes.
71m Normal Pulled pod/logging-loki-gateway-7b58bd6fcd-58vxq Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:13cd0be568d2cac6058c41fb774f1baafc03dc696d57d0eaa612db681818653d" in 2.3s (2.3s including waiting). Image size: 160762347 bytes.
71m Normal Started pod/logging-loki-gateway-7b58bd6fcd-58vxq Started container opa
71m Normal Started pod/logging-loki-gateway-7b58bd6fcd-n8wjk Started container opa
71m Normal Created pod/logging-loki-gateway-7b58bd6fcd-n8wjk Created container opa
71m Normal Created pod/logging-loki-gateway-7b58bd6fcd-58vxq Created container opa
71m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
71m Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: HTTP probe failed with statuscode: 503
71m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
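The Warning events above (FailedMount, ProbeError) are easy to lose among the Normal noise in a stream like this. Assuming the events were saved to a file, they can be isolated with a simple column filter; this is an illustrative sketch, and the file name and sample lines below are stand-ins rather than output from this cluster:

```shell
# Write a two-line stand-in for a saved `oc get events` dump
# (events.txt is a hypothetical file name).
cat > events.txt <<'EOF'
71m Warning FailedMount pod/logging-loki-gateway-7b58bd6fcd-58vxq MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
71m Normal Pulled pod/logging-loki-compactor-0 Successfully pulled image
EOF

# Keep only Warning events (the second whitespace-separated column is TYPE).
awk '$2 == "Warning"' events.txt
```

The same filter works on live output piped from `oc get events -n openshift-logging`.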
70m Normal SuccessfulCreate daemonset/collector Created pod: collector-zxprd
70m Normal SuccessfulDelete daemonset/collector Deleted pod: collector-zxprd
70m Normal Scheduled pod/collector-zxprd Successfully assigned openshift-logging/collector-zxprd to crc
70m Warning FailedMount pod/collector-zxprd MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
70m Warning FailedMount pod/collector-zxprd MountVolume.SetUp failed for volume "collector-syslog-receiver" : secret "collector-syslog-receiver" not found
70m Normal SuccessfulCreate daemonset/collector Created pod: collector-5tn9r
70m Normal Scheduled pod/collector-5tn9r Successfully assigned openshift-logging/collector-5tn9r to crc
70m Normal AddedInterface pod/collector-5tn9r Add eth0 [10.217.0.83/23] from ovn-kubernetes
70m Normal Pulling pod/collector-5tn9r Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978"
70m Normal Started pod/collector-5tn9r Started container collector
70m Normal Created pod/collector-5tn9r Created container collector
70m Normal Pulled pod/collector-5tn9r Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:7683a6c06eb7e40d1338bcf4a3f846b03e4993c0caef74234126696c21551978" in 9.414s (9.414s including waiting). Image size: 320720510 bytes.
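When events repeat (as the FailedMount entries for the collector pod do above), a frequency count by REASON often shows at a glance where a rollout is stuck. A minimal sketch over a saved dump; the file name and sample lines are illustrative, mirroring the collector events:

```shell
# Tally events by REASON (the third whitespace-separated column).
# events.txt is a hypothetical saved dump of `oc get events` output.
cat > events.txt <<'EOF'
70m Warning FailedMount pod/collector-zxprd MountVolume.SetUp failed for volume "metrics"
70m Warning FailedMount pod/collector-zxprd MountVolume.SetUp failed for volume "collector-syslog-receiver"
70m Normal Pulled pod/collector-5tn9r Successfully pulled image
EOF

# Print "count reason" pairs, most frequent first.
awk '{count[$3]++} END {for (r in count) print count[r], r}' events.txt | sort -rn
```

On this sample the first line printed is `2 FailedMount`, which is exactly the kind of recurring failure worth investigating first.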
65m Normal IPAllocated service/openstack-logging Assigned IP ["172.17.0.80"]
65m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/address-pool
65m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
65m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
65m Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
7m13s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m13s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m13s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m13s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m3s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe failed: Get "https://10.217.0.55:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
7m3s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe error: Get "https://10.217.0.55:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m57s Warning ProbeError pod/logging-loki-querier-76bf7b6d45-t4b27 Liveness probe error: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m57s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-58vxq Liveness probe failed: Get "https://10.217.0.56:8083/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m57s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m57s Warning ProbeError pod/logging-loki-distributor-5d5548c9f5-2wbxs Liveness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m57s Warning Unhealthy pod/logging-loki-distributor-5d5548c9f5-2wbxs Liveness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m57s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-n8wjk Liveness probe failed: Get "https://10.217.0.55:8083/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m57s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Liveness probe error: Get "https://10.217.0.55:8083/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m57s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-n8wjk Liveness probe failed: Get "https://10.217.0.55:8081/live": context deadline exceeded
6m57s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Liveness probe error: Get "https://10.217.0.55:8081/live": context deadline exceeded...
6m57s Warning Unhealthy pod/logging-loki-query-frontend-6d6859c548-x4qsb Liveness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m57s Warning ProbeError pod/logging-loki-query-frontend-6d6859c548-x4qsb Liveness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m57s Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m57s Warning Unhealthy pod/logging-loki-querier-76bf7b6d45-t4b27 Liveness probe failed: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m57s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m57s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-58vxq Liveness probe error: Get "https://10.217.0.56:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m57s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-58vxq Liveness probe failed: Get "https://10.217.0.56:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m57s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-58vxq Liveness probe error: Get "https://10.217.0.56:8083/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m57s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m56s Warning Unhealthy pod/logging-loki-index-gateway-0 Liveness probe failed: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m56s Warning ProbeError pod/logging-loki-index-gateway-0 Liveness probe error: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m53s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m48s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m48s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m48s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe error: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m48s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe failed: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m47s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.60:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m47s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.60:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m43s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m43s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m43s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe error: Get "https://10.217.0.55:8081/ready": context deadline exceeded...
6m43s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m43s Warning ProbeError pod/logging-loki-gateway-7b58bd6fcd-58vxq Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m43s Warning Unhealthy pod/logging-loki-gateway-7b58bd6fcd-n8wjk Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m38s Warning Unhealthy pod/logging-loki-querier-76bf7b6d45-t4b27 Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m38s Warning ProbeError pod/logging-loki-querier-76bf7b6d45-t4b27 Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m37s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m37s Warning ProbeError pod/logging-loki-distributor-5d5548c9f5-2wbxs Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m37s Warning Unhealthy pod/logging-loki-distributor-5d5548c9f5-2wbxs Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m37s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m37s Warning ProbeError pod/logging-loki-querier-76bf7b6d45-t4b27 Readiness probe error: Get "https://10.217.0.53:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m37s Warning Unhealthy pod/logging-loki-query-frontend-6d6859c548-x4qsb Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m37s Warning Unhealthy pod/logging-loki-querier-76bf7b6d45-t4b27 Readiness probe failed: Get "https://10.217.0.53:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m37s Warning ProbeError pod/logging-loki-query-frontend-6d6859c548-x4qsb Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m36s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m36s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m36s Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m36s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m36s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m36s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m50s Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
5m50s Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
5m50s Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
5m50s Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/address-pool
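The deprecatedAnnotation warnings on service/openstack-logging refer to MetalLB's legacy `metallb.universe.tf/*` service annotations, which newer MetalLB releases replace with `metallb.io/*` annotations and CRD-based pool configuration. A hedged sketch of the CRD side, assuming MetalLB's `metallb.io/v1beta1` API; the pool and advertisement names are hypothetical, and only the IP is taken from the events above:

```yaml
# Assumed CRD equivalent of the deprecated metallb.universe.tf/address-pool
# annotation: an IPAddressPool plus an L2Advertisement (layer2 mode, as in
# the nodeAssigned events above).
apiVersion: metallb.io/v1beta1
kind: IPAddressPool
metadata:
  name: openstack-logging-pool      # hypothetical name
  namespace: metallb-system
spec:
  addresses:
    - 172.17.0.80/32                # the IP assigned in the events above
---
apiVersion: metallb.io/v1beta1
kind: L2Advertisement
metadata:
  name: openstack-logging-l2        # hypothetical name
  namespace: metallb-system
spec:
  ipAddressPools:
    - openstack-logging-pool
```

On the Service itself, the deprecated annotations have `metallb.io/`-prefixed counterparts (e.g. `metallb.io/address-pool`); until the annotations are migrated, MetalLB keeps honoring the old names and only emits the warnings seen here.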