LAST SEEN  TYPE     REASON               OBJECT                                                MESSAGE
103m   Normal   RequirementsUnknown   clusterserviceversion/cluster-logging.v6.2.7   requirements not yet checked
103m   Normal   RequirementsNotMet    clusterserviceversion/cluster-logging.v6.2.7   one or more requirements couldn't be found
103m   Normal   AllRequirementsMet    clusterserviceversion/cluster-logging.v6.2.7   all requirements found, attempting install
103m   Normal   SuccessfulCreate      replicaset/cluster-logging-operator-79cf69ddc8   Created pod: cluster-logging-operator-79cf69ddc8-d28w5
103m   Normal   Scheduled             pod/cluster-logging-operator-79cf69ddc8-d28w5   Successfully assigned openshift-logging/cluster-logging-operator-79cf69ddc8-d28w5 to crc
103m   Normal   ScalingReplicaSet     deployment/cluster-logging-operator   Scaled up replica set cluster-logging-operator-79cf69ddc8 to 1
103m   Normal   InstallSucceeded      clusterserviceversion/cluster-logging.v6.2.7   waiting for install components to report healthy
103m   Normal   Pulling               pod/cluster-logging-operator-79cf69ddc8-d28w5   Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3"
103m   Normal   AddedInterface        pod/cluster-logging-operator-79cf69ddc8-d28w5   Add eth0 [10.217.0.49/23] from ovn-kubernetes
103m   Normal   InstallWaiting        clusterserviceversion/cluster-logging.v6.2.7   installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
103m   Normal   Created               pod/cluster-logging-operator-79cf69ddc8-d28w5   Created container cluster-logging-operator
103m   Normal   Started               pod/cluster-logging-operator-79cf69ddc8-d28w5   Started container cluster-logging-operator
103m   Normal   Pulled                pod/cluster-logging-operator-79cf69ddc8-d28w5   Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3" in 11.678s (11.678s including waiting). Image size: 343173849 bytes.
103m   Normal   InstallSucceeded      clusterserviceversion/cluster-logging.v6.2.7   install strategy completed with no errors
102m   Normal   Scheduled             pod/logging-loki-gateway-76696895d9-g5tqr   Successfully assigned openshift-logging/logging-loki-gateway-76696895d9-g5tqr to crc
102m   Normal   SuccessfulCreate      statefulset/logging-loki-index-gateway   create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
102m   Normal   Scheduled             pod/logging-loki-query-frontend-69d9546745-pcd6x   Successfully assigned openshift-logging/logging-loki-query-frontend-69d9546745-pcd6x to crc
102m   Normal   ScalingReplicaSet     deployment/logging-loki-querier   Scaled up replica set logging-loki-querier-76788598db to 1
102m   Normal   SuccessfulCreate      replicaset/logging-loki-querier-76788598db   Created pod: logging-loki-querier-76788598db-dkn9m
102m   Normal   SuccessfulCreate      replicaset/logging-loki-query-frontend-69d9546745   Created pod: logging-loki-query-frontend-69d9546745-pcd6x
102m   Normal   Scheduled             pod/logging-loki-querier-76788598db-dkn9m   Successfully assigned openshift-logging/logging-loki-querier-76788598db-dkn9m to crc
102m   Normal   SuccessfulCreate      statefulset/logging-loki-ingester   create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
102m   Normal   NoPods                poddisruptionbudget/logging-loki-ingester   No matching pods found
102m   Normal   SuccessfulCreate      statefulset/logging-loki-ingester   create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
102m   Normal   SuccessfulCreate      statefulset/logging-loki-ingester   create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
102m   Normal   ScalingReplicaSet     deployment/logging-loki-query-frontend   Scaled up replica set logging-loki-query-frontend-69d9546745 to 1
102m   Normal   NoPods                poddisruptionbudget/logging-loki-index-gateway   No matching pods found
102m   Normal   SuccessfulCreate      statefulset/logging-loki-index-gateway   create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
102m   Normal   ScalingReplicaSet     deployment/logging-loki-gateway   Scaled up replica set logging-loki-gateway-76696895d9 to 2
102m   Normal   SuccessfulCreate      replicaset/logging-loki-gateway-76696895d9   Created pod: logging-loki-gateway-76696895d9-c6d96
102m   Normal   SuccessfulCreate      replicaset/logging-loki-gateway-76696895d9   Created pod: logging-loki-gateway-76696895d9-g5tqr
102m   Normal   WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-compactor-0   waiting for first consumer to be created before binding
102m   Normal   Provisioning          persistentvolumeclaim/storage-logging-loki-compactor-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
102m   Normal   ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-compactor-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
102m   Normal   ProvisioningSucceeded persistentvolumeclaim/wal-logging-loki-ingester-0   Successfully provisioned volume pvc-fca63622-5aca-4efb-a7fe-bb443a1c1f59
102m   Normal   Provisioning          persistentvolumeclaim/wal-logging-loki-ingester-0   External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
102m   Normal   SuccessfulCreate      statefulset/logging-loki-compactor   create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
102m   Normal   SuccessfulCreate      statefulset/logging-loki-compactor   create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
102m   Normal   Scheduled             pod/logging-loki-distributor-5f678c8dd6-2755m   Successfully assigned openshift-logging/logging-loki-distributor-5f678c8dd6-2755m to crc
102m   Normal   ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-compactor-0   Successfully provisioned volume pvc-7d3bb0be-7a81-454c-ac38-c6ad37f0ea95
102m   Normal   WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-index-gateway-0   waiting for first consumer to be created before binding
102m   Normal   Provisioning          persistentvolumeclaim/storage-logging-loki-index-gateway-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
102m   Normal   ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
102m   Normal   ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-index-gateway-0   Successfully provisioned volume pvc-ed1092d2-65bc-47b0-81f9-72627d9feec9
102m   Normal   WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-ingester-0   waiting for first consumer to be created before binding
102m   Normal   ExternalProvisioning  persistentvolumeclaim/wal-logging-loki-ingester-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
102m   Normal   ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-ingester-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
102m   Normal   Provisioning          persistentvolumeclaim/storage-logging-loki-ingester-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
102m   Normal   WaitForFirstConsumer  persistentvolumeclaim/wal-logging-loki-ingester-0   waiting for first consumer to be created before binding
102m   Normal   ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-ingester-0   Successfully provisioned volume pvc-e5484364-652f-4506-b78b-405e87866424
102m   Normal   SuccessfulCreate      replicaset/logging-loki-distributor-5f678c8dd6   Created pod: logging-loki-distributor-5f678c8dd6-2755m
102m   Normal   ScalingReplicaSet     deployment/logging-loki-distributor   Scaled up replica set logging-loki-distributor-5f678c8dd6 to 1
102m   Normal   Scheduled             pod/logging-loki-gateway-76696895d9-c6d96   Successfully assigned openshift-logging/logging-loki-gateway-76696895d9-c6d96 to crc
102m   Normal   AddedInterface        pod/logging-loki-query-frontend-69d9546745-pcd6x   Add eth0 [10.217.0.54/23] from ovn-kubernetes
102m   Normal   Pulling               pod/logging-loki-gateway-76696895d9-c6d96   Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198"
102m   Normal   AddedInterface        pod/logging-loki-gateway-76696895d9-c6d96   Add eth0 [10.217.0.56/23] from ovn-kubernetes
102m   Normal   Scheduled             pod/logging-loki-index-gateway-0   Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
102m   Normal   Scheduled             pod/logging-loki-compactor-0   Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
102m   Normal   Scheduled             pod/logging-loki-ingester-0   Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
102m   Normal   AddedInterface        pod/logging-loki-distributor-5f678c8dd6-2755m   Add eth0 [10.217.0.52/23] from ovn-kubernetes
102m   Normal   Pulling               pod/logging-loki-gateway-76696895d9-g5tqr   Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198"
102m   Normal   AddedInterface        pod/logging-loki-gateway-76696895d9-g5tqr   Add eth0 [10.217.0.55/23] from ovn-kubernetes
102m   Normal   Pulling               pod/logging-loki-distributor-5f678c8dd6-2755m   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
102m   Normal   AddedInterface        pod/logging-loki-querier-76788598db-dkn9m   Add eth0 [10.217.0.53/23] from ovn-kubernetes
102m   Normal   Pulling               pod/logging-loki-query-frontend-69d9546745-pcd6x   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
102m   Normal   Pulling               pod/logging-loki-querier-76788598db-dkn9m   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
102m   Normal   Pulling               pod/logging-loki-ingester-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
102m   Normal   Pulling               pod/logging-loki-compactor-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
102m   Normal   AddedInterface        pod/logging-loki-index-gateway-0   Add eth0 [10.217.0.60/23] from ovn-kubernetes
102m   Normal   Pulling               pod/logging-loki-index-gateway-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
102m   Normal   AddedInterface        pod/logging-loki-compactor-0   Add eth0 [10.217.0.58/23] from ovn-kubernetes
102m   Normal   AddedInterface        pod/logging-loki-ingester-0   Add eth0 [10.217.0.57/23] from ovn-kubernetes
102m   Normal   Started               pod/logging-loki-query-frontend-69d9546745-pcd6x   Started container loki-query-frontend
102m   Normal   Pulling               pod/logging-loki-gateway-76696895d9-g5tqr   Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5"
102m   Normal   Pulled                pod/logging-loki-query-frontend-69d9546745-pcd6x   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.741s (3.741s including waiting). Image size: 225282830 bytes.
102m   Normal   Started               pod/logging-loki-gateway-76696895d9-g5tqr   Started container gateway
102m   Normal   Created               pod/logging-loki-query-frontend-69d9546745-pcd6x   Created container loki-query-frontend
102m   Normal   Pulled                pod/logging-loki-compactor-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 2.894s (2.894s including waiting). Image size: 225282830 bytes.
102m   Normal   Pulled                pod/logging-loki-gateway-76696895d9-g5tqr   Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198" in 3.626s (3.626s including waiting). Image size: 180263561 bytes.
102m   Normal   Created               pod/logging-loki-gateway-76696895d9-g5tqr   Created container gateway
102m   Normal   Started               pod/logging-loki-ingester-0   Started container loki-ingester
102m   Normal   Started               pod/logging-loki-querier-76788598db-dkn9m   Started container loki-querier
102m   Normal   Pulling               pod/logging-loki-gateway-76696895d9-c6d96   Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5"
102m   Normal   Created               pod/logging-loki-ingester-0   Created container loki-ingester
102m   Normal   Pulled                pod/logging-loki-ingester-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 2.957s (2.957s including waiting). Image size: 225282830 bytes.
102m   Normal   Started               pod/logging-loki-distributor-5f678c8dd6-2755m   Started container loki-distributor
102m   Normal   Created               pod/logging-loki-compactor-0   Created container loki-compactor
102m   Normal   Pulled                pod/logging-loki-querier-76788598db-dkn9m   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.964s (3.964s including waiting). Image size: 225282830 bytes.
102m   Normal   Started               pod/logging-loki-compactor-0   Started container loki-compactor
102m   Normal   Started               pod/logging-loki-gateway-76696895d9-c6d96   Started container gateway
102m   Normal   Created               pod/logging-loki-querier-76788598db-dkn9m   Created container loki-querier
102m   Normal   Started               pod/logging-loki-index-gateway-0   Started container loki-index-gateway
102m   Normal   Created               pod/logging-loki-index-gateway-0   Created container loki-index-gateway
102m   Normal   Pulled                pod/logging-loki-index-gateway-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 2.824s (2.824s including waiting). Image size: 225282830 bytes.
102m   Normal   Pulled                pod/logging-loki-distributor-5f678c8dd6-2755m   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.721s (3.721s including waiting). Image size: 225282830 bytes.
102m   Normal   Created               pod/logging-loki-distributor-5f678c8dd6-2755m   Created container loki-distributor
102m   Normal   Created               pod/logging-loki-gateway-76696895d9-c6d96   Created container gateway
102m   Normal   Pulled                pod/logging-loki-gateway-76696895d9-c6d96   Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198" in 3.534s (3.534s including waiting). Image size: 180263561 bytes.
102m   Normal   Created               pod/logging-loki-gateway-76696895d9-c6d96   Created container opa
102m   Normal   Created               pod/logging-loki-gateway-76696895d9-g5tqr   Created container opa
102m   Normal   Pulled                pod/logging-loki-gateway-76696895d9-c6d96   Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5" in 2.915s (2.915s including waiting). Image size: 160768505 bytes.
102m   Normal   Started               pod/logging-loki-gateway-76696895d9-c6d96   Started container opa
102m   Normal   Started               pod/logging-loki-gateway-76696895d9-g5tqr   Started container opa
102m   Normal   Pulled                pod/logging-loki-gateway-76696895d9-g5tqr   Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5" in 2.874s (2.874s including waiting). Image size: 160768505 bytes.
101m   Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: HTTP probe failed with statuscode: 503...
101m   Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: HTTP probe failed with statuscode: 503...
101m   Warning  Unhealthy             pod/logging-loki-ingester-0   Readiness probe failed: HTTP probe failed with statuscode: 503
101m   Warning  FailedMount           pod/collector-pg6pj   MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
101m   Normal   Scheduled             pod/collector-pg6pj   Successfully assigned openshift-logging/collector-pg6pj to crc
101m   Warning  FailedMount           pod/collector-pg6pj   MountVolume.SetUp failed for volume "collector-syslog-receiver" : secret "collector-syslog-receiver" not found
101m   Normal   SuccessfulCreate      daemonset/collector   Created pod: collector-pg6pj
101m   Normal   SuccessfulDelete      daemonset/collector   Deleted pod: collector-pg6pj
101m   Normal   SuccessfulCreate      daemonset/collector   Created pod: collector-gthjs
101m   Normal   Scheduled             pod/collector-gthjs   Successfully assigned openshift-logging/collector-gthjs to crc
101m   Normal   AddedInterface        pod/collector-gthjs   Add eth0 [10.217.0.63/23] from ovn-kubernetes
101m   Normal   Pulling               pod/collector-gthjs   Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:fa2cfa2ed336ce105c8dea5bfe0825407e37ef296193ae162f515213fe43c8d5"
100m   Normal   Started               pod/collector-gthjs   Started container collector
100m   Normal   Created               pod/collector-gthjs   Created container collector
100m   Normal   Pulled                pod/collector-gthjs   Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:fa2cfa2ed336ce105c8dea5bfe0825407e37ef296193ae162f515213fe43c8d5" in 9.605s (9.605s including waiting). Image size: 344557702 bytes.
94m    Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/address-pool
94m    Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
94m    Normal   IPAllocated           service/openstack-logging   Assigned IP ["172.17.0.80"]
94m    Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
94m    Normal   nodeAssigned          service/openstack-logging   announcing from node "crc" with protocol "layer2"
5m53s  Warning  Unhealthy             pod/logging-loki-querier-76788598db-dkn9m   Liveness probe failed: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe failed: Get "https://10.217.0.56:8083/ready": context deadline exceeded
5m53s  Warning  ProbeError            pod/logging-loki-query-frontend-69d9546745-pcd6x   Liveness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  ProbeError            pod/logging-loki-distributor-5f678c8dd6-2755m   Liveness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  ProbeError            pod/logging-loki-compactor-0   Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  Unhealthy             pod/logging-loki-index-gateway-0   Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  ProbeError            pod/logging-loki-index-gateway-0   Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe error: Get "https://10.217.0.55:8081/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  Unhealthy             pod/logging-loki-distributor-5f678c8dd6-2755m   Liveness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe error: Get "https://10.217.0.56:8083/ready": context deadline exceeded...
5m53s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  Unhealthy             pod/logging-loki-ingester-0   Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe failed: Get "https://10.217.0.55:8081/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  Unhealthy             pod/logging-loki-query-frontend-69d9546745-pcd6x   Liveness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m53s  Warning  ProbeError            pod/logging-loki-querier-76788598db-dkn9m   Liveness probe error: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m53s  Warning  Unhealthy             pod/logging-loki-compactor-0   Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m52s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Liveness probe error: Get "https://10.217.0.56:8081/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m52s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-c6d96   Liveness probe failed: Get "https://10.217.0.56:8081/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m52s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Liveness probe error: Get "https://10.217.0.55:8081/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m52s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Liveness probe error: Get "https://10.217.0.56:8083/live": context deadline exceeded...
5m52s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Liveness probe failed: Get "https://10.217.0.55:8081/live": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m52s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-c6d96   Liveness probe failed: Get "https://10.217.0.56:8083/live": context deadline exceeded
5m52s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Liveness probe failed: Get "https://10.217.0.55:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m52s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Liveness probe error: Get "https://10.217.0.55:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m44s  Warning  ProbeError            pod/logging-loki-distributor-5f678c8dd6-2755m   Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m44s  Warning  Unhealthy             pod/logging-loki-query-frontend-69d9546745-pcd6x   Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m44s  Warning  ProbeError            pod/logging-loki-query-frontend-69d9546745-pcd6x   Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m44s  Warning  Unhealthy             pod/logging-loki-distributor-5f678c8dd6-2755m   Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m43s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe error: Get "https://10.217.0.55:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m43s  Warning  ProbeError            pod/logging-loki-compactor-0   Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m43s  Warning  Unhealthy             pod/logging-loki-ingester-0   Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe failed: Get "https://10.217.0.55:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  Unhealthy             pod/logging-loki-index-gateway-0   Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  ProbeError            pod/logging-loki-index-gateway-0   Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m43s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  Unhealthy             pod/logging-loki-compactor-0   Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m43s  Warning  ProbeError            pod/logging-loki-ingester-0   Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m38s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m38s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m38s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe failed: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m38s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m38s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-g5tqr   Readiness probe error: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m38s  Warning  Unhealthy             pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m38s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m34s  Warning  ProbeError            pod/logging-loki-distributor-5f678c8dd6-2755m   Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m34s  Warning  Unhealthy             pod/logging-loki-querier-76788598db-dkn9m   Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
5m34s  Warning  ProbeError            pod/logging-loki-query-frontend-69d9546745-pcd6x   Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
5m34s  Warning  Unhealthy             pod/logging-loki-query-frontend-69d9546745-pcd6x   Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m34s  Warning  ProbeError            pod/logging-loki-querier-76788598db-dkn9m   Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
5m34s  Warning  Unhealthy             pod/logging-loki-distributor-5f678c8dd6-2755m   Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
5m33s  Warning  ProbeError            pod/logging-loki-gateway-76696895d9-c6d96   Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
4m43s  Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
4m43s  Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/address-pool
4m43s  Normal   nodeAssigned          service/openstack-logging   announcing from node "crc" with protocol "layer2"
4m43s  Warning  deprecatedAnnotation  service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/allow-shared-ip