LAST SEEN  TYPE  REASON  OBJECT  MESSAGE
87m  Normal  RequirementsUnknown  clusterserviceversion/cluster-logging.v6.2.7  requirements not yet checked
87m  Normal  RequirementsNotMet  clusterserviceversion/cluster-logging.v6.2.7  one or more requirements couldn't be found
87m  Normal  AllRequirementsMet  clusterserviceversion/cluster-logging.v6.2.7  all requirements found, attempting install
87m  Normal  SuccessfulCreate  replicaset/cluster-logging-operator-79cf69ddc8  Created pod: cluster-logging-operator-79cf69ddc8-kx46n
87m  Normal  InstallSucceeded  clusterserviceversion/cluster-logging.v6.2.7  waiting for install components to report healthy
87m  Normal  Scheduled  pod/cluster-logging-operator-79cf69ddc8-kx46n  Successfully assigned openshift-logging/cluster-logging-operator-79cf69ddc8-kx46n to crc
87m  Normal  ScalingReplicaSet  deployment/cluster-logging-operator  Scaled up replica set cluster-logging-operator-79cf69ddc8 to 1
87m  Normal  InstallWaiting  clusterserviceversion/cluster-logging.v6.2.7  installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
87m  Normal  AddedInterface  pod/cluster-logging-operator-79cf69ddc8-kx46n  Add eth0 [10.217.0.48/23] from ovn-kubernetes
87m  Warning  Failed  pod/cluster-logging-operator-79cf69ddc8-kx46n  Error: ErrImagePull
87m  Warning  Failed  pod/cluster-logging-operator-79cf69ddc8-kx46n  Error: ImagePullBackOff
87m  Normal  BackOff  pod/cluster-logging-operator-79cf69ddc8-kx46n  Back-off pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3"
87m  Warning  Failed  pod/cluster-logging-operator-79cf69ddc8-kx46n  Failed to pull image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3": rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
87m  Normal  Pulling  pod/cluster-logging-operator-79cf69ddc8-kx46n  Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3"
87m  Normal  Started  pod/cluster-logging-operator-79cf69ddc8-kx46n  Started container cluster-logging-operator
87m  Normal  Created  pod/cluster-logging-operator-79cf69ddc8-kx46n  Created container cluster-logging-operator
87m  Normal  Pulled  pod/cluster-logging-operator-79cf69ddc8-kx46n  Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:80cac88d8ff5b40036e5983f5dacfc08702afe9c7a66b48d1c88bcb149c285b3" in 3.813s (3.813s including waiting). Image size: 343173849 bytes.
87m  Normal  InstallSucceeded  clusterserviceversion/cluster-logging.v6.2.7  install strategy completed with no errors
86m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
86m  Normal  Scheduled  pod/logging-loki-distributor-5f678c8dd6-k62tn  Successfully assigned openshift-logging/logging-loki-distributor-5f678c8dd6-k62tn to crc
86m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
86m  Normal  WaitForFirstConsumer  persistentvolumeclaim/wal-logging-loki-ingester-0  waiting for first consumer to be created before binding
86m  Normal  NoPods  poddisruptionbudget/logging-loki-ingester  No matching pods found
86m  Normal  SuccessfulCreate  statefulset/logging-loki-ingester  create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
86m  Normal  ScalingReplicaSet  deployment/logging-loki-distributor  Scaled up replica set logging-loki-distributor-5f678c8dd6 to 1
86m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-ingester-0  waiting for first consumer to be created before binding
86m  Normal  SuccessfulCreate  replicaset/logging-loki-distributor-5f678c8dd6  Created pod: logging-loki-distributor-5f678c8dd6-k62tn
86m  Normal  SuccessfulCreate  replicaset/logging-loki-query-frontend-69d9546745  Created pod: logging-loki-query-frontend-69d9546745-mlj7c
86m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-compactor-0  Successfully provisioned volume pvc-ae383121-5983-49ec-84ec-77f80067e382
86m  Warning  FailedMount  pod/logging-loki-gateway-7dbfd5bb68-zslxm  MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
86m  Normal  Scheduled  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Successfully assigned openshift-logging/logging-loki-gateway-7dbfd5bb68-zslxm to crc
86m  Normal  Provisioning  persistentvolumeclaim/wal-logging-loki-ingester-0  External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
86m  Normal  SuccessfulCreate  replicaset/logging-loki-gateway-7dbfd5bb68  Created pod: logging-loki-gateway-7dbfd5bb68-zmqn9
86m  Normal  ScalingReplicaSet  deployment/logging-loki-gateway  Scaled up replica set logging-loki-gateway-7dbfd5bb68 to 2
86m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m  Normal  WaitForFirstConsumer  persistentvolumeclaim/storage-logging-loki-index-gateway-0  waiting for first consumer to be created before binding
86m  Normal  SuccessfulCreate  statefulset/logging-loki-compactor  create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
86m  Normal  SuccessfulCreate  statefulset/logging-loki-compactor  create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
86m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-index-gateway-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
86m  Normal  AddedInterface  pod/logging-loki-distributor-5f678c8dd6-k62tn  Add eth0 [10.217.0.52/23] from ovn-kubernetes
86m  Normal  Pulling  pod/logging-loki-distributor-5f678c8dd6-k62tn  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
86m  Normal  SuccessfulCreate  replicaset/logging-loki-gateway-7dbfd5bb68  Created pod: logging-loki-gateway-7dbfd5bb68-zslxm
86m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-ingester-0  Successfully provisioned volume pvc-1794901a-7086-4d2e-bae7-3a0de432c19c
86m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-ingester-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
86m  Normal  ExternalProvisioning  persistentvolumeclaim/wal-logging-loki-ingester-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m  Normal  Provisioning  persistentvolumeclaim/storage-logging-loki-compactor-0  External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
86m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-compactor-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m  Normal  ScalingReplicaSet  deployment/logging-loki-query-frontend  Scaled up replica set logging-loki-query-frontend-69d9546745 to 1
86m  Normal  ProvisioningSucceeded  persistentvolumeclaim/wal-logging-loki-ingester-0  Successfully provisioned volume pvc-138ca281-100b-41f5-85ec-60bf09970c08
86m  Normal  Scheduled  pod/logging-loki-querier-76788598db-rg7hv  Successfully assigned openshift-logging/logging-loki-querier-76788598db-rg7hv to crc
86m  Normal  ProvisioningSucceeded  persistentvolumeclaim/storage-logging-loki-index-gateway-0  Successfully provisioned volume pvc-7579da74-f173-4a9a-a494-8682cc2887e1
86m  Normal  ExternalProvisioning  persistentvolumeclaim/storage-logging-loki-ingester-0  Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
86m  Normal  Scheduled  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Successfully assigned openshift-logging/logging-loki-gateway-7dbfd5bb68-zmqn9 to crc
86m  Warning  FailedMount  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
86m  Normal  SuccessfulCreate  replicaset/logging-loki-querier-76788598db  Created pod: logging-loki-querier-76788598db-rg7hv
86m  Normal  ScalingReplicaSet  deployment/logging-loki-querier  Scaled up replica set logging-loki-querier-76788598db to 1
86m  Normal  Scheduled  pod/logging-loki-query-frontend-69d9546745-mlj7c  Successfully assigned openshift-logging/logging-loki-query-frontend-69d9546745-mlj7c to crc
86m  Normal  SuccessfulCreate  statefulset/logging-loki-index-gateway  create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
86m  Normal  SuccessfulCreate  statefulset/logging-loki-index-gateway  create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
86m  Normal  NoPods  poddisruptionbudget/logging-loki-index-gateway  No matching pods found
86m  Normal  AddedInterface  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Add eth0 [10.217.0.55/23] from ovn-kubernetes
86m  Normal  AddedInterface  pod/logging-loki-querier-76788598db-rg7hv  Add eth0 [10.217.0.53/23] from ovn-kubernetes
86m  Normal  Pulling  pod/logging-loki-query-frontend-69d9546745-mlj7c  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
86m  Normal  AddedInterface  pod/logging-loki-query-frontend-69d9546745-mlj7c  Add eth0 [10.217.0.54/23] from ovn-kubernetes
86m  Normal  Pulling  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198"
86m  Normal  Pulling  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198"
86m  Normal  Scheduled  pod/logging-loki-compactor-0  Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
86m  Normal  Scheduled  pod/logging-loki-ingester-0  Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
86m  Normal  AddedInterface  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Add eth0 [10.217.0.56/23] from ovn-kubernetes
86m  Normal  AddedInterface  pod/logging-loki-compactor-0  Add eth0 [10.217.0.58/23] from ovn-kubernetes
86m  Normal  Pulling  pod/logging-loki-querier-76788598db-rg7hv  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
86m  Normal  Scheduled  pod/logging-loki-index-gateway-0  Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
86m  Normal  Pulling  pod/logging-loki-index-gateway-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
86m  Normal  AddedInterface  pod/logging-loki-index-gateway-0  Add eth0 [10.217.0.60/23] from ovn-kubernetes
86m  Normal  Pulling  pod/logging-loki-compactor-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
86m  Normal  Pulling  pod/logging-loki-ingester-0  Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459"
86m  Normal  AddedInterface  pod/logging-loki-ingester-0  Add eth0 [10.217.0.57/23] from ovn-kubernetes
86m  Normal  Pulled  pod/logging-loki-index-gateway-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.778s (3.778s including waiting). Image size: 225282830 bytes.
86m  Normal  Pulled  pod/logging-loki-distributor-5f678c8dd6-k62tn  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 5.071s (5.071s including waiting). Image size: 225282830 bytes.
86m  Normal  Pulled  pod/logging-loki-compactor-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.918s (3.918s including waiting). Image size: 225282830 bytes.
86m  Normal  Pulled  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198" in 4.337s (4.337s including waiting). Image size: 180263561 bytes.
86m  Normal  Pulled  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:574afd46f23637b1685ddb00b647bb71d2e2d527611182fd2826bb818c5eb198" in 4.073s (4.073s including waiting). Image size: 180263561 bytes.
86m  Normal  Created  pod/logging-loki-query-frontend-69d9546745-mlj7c  Created container loki-query-frontend
86m  Normal  Started  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Started container gateway
86m  Normal  Pulling  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5"
86m  Normal  Started  pod/logging-loki-ingester-0  Started container loki-ingester
86m  Normal  Created  pod/logging-loki-ingester-0  Created container loki-ingester
86m  Normal  Pulled  pod/logging-loki-ingester-0  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 3.833s (3.833s including waiting). Image size: 225282830 bytes.
86m  Normal  Created  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Created container gateway
86m  Normal  Pulled  pod/logging-loki-querier-76788598db-rg7hv  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 4.839s (4.839s including waiting). Image size: 225282830 bytes.
86m  Normal  Created  pod/logging-loki-querier-76788598db-rg7hv  Created container loki-querier
86m  Normal  Started  pod/logging-loki-querier-76788598db-rg7hv  Started container loki-querier
86m  Normal  Pulled  pod/logging-loki-query-frontend-69d9546745-mlj7c  Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:8c073f8de8288bf63489a400d7f93e96562652832965d33e0515f48d46282459" in 4.744s (4.744s including waiting). Image size: 225282830 bytes.
86m  Normal  Pulling  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5"
86m  Normal  Started  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Started container gateway
86m  Normal  Started  pod/logging-loki-index-gateway-0  Started container loki-index-gateway
86m  Normal  Created  pod/logging-loki-index-gateway-0  Created container loki-index-gateway
86m  Normal  Created  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Created container gateway
86m  Normal  Started  pod/logging-loki-query-frontend-69d9546745-mlj7c  Started container loki-query-frontend
86m  Normal  Started  pod/logging-loki-distributor-5f678c8dd6-k62tn  Started container loki-distributor
86m  Normal  Created  pod/logging-loki-distributor-5f678c8dd6-k62tn  Created container loki-distributor
86m  Normal  Started  pod/logging-loki-compactor-0  Started container loki-compactor
86m  Normal  Created  pod/logging-loki-compactor-0  Created container loki-compactor
86m  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe error: Get "https://10.217.0.56:8083/ready": dial tcp 10.217.0.56:8083: connect: connection refused...
86m  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe failed: Get "https://10.217.0.56:8083/ready": dial tcp 10.217.0.56:8083: connect: connection refused
86m  Normal  Created  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Created container opa
86m  Normal  Pulled  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5" in 2.071s (2.071s including waiting). Image size: 160768505 bytes.
86m  Normal  Pulled  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:182dc5ab627d64b5e565a2edb170903ac381fc534b4366fc5ffae088ea0de5e5" in 2.015s (2.015s including waiting). Image size: 160768505 bytes.
86m  Normal  Created  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Created container opa
86m  Normal  Started  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Started container opa
86m  Normal  Started  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Started container opa
86m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: HTTP probe failed with statuscode: 503...
86m  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: HTTP probe failed with statuscode: 503...
86m  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: HTTP probe failed with statuscode: 503
85m  Normal  Scheduled  pod/collector-wc92p  Successfully assigned openshift-logging/collector-wc92p to crc
85m  Normal  SuccessfulCreate  daemonset/collector  Created pod: collector-wc92p
85m  Normal  SuccessfulDelete  daemonset/collector  Deleted pod: collector-wc92p
85m  Normal  SuccessfulCreate  daemonset/collector  Created pod: collector-b68pd
85m  Normal  Scheduled  pod/collector-b68pd  Successfully assigned openshift-logging/collector-b68pd to crc
85m  Normal  Pulling  pod/collector-b68pd  Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:fa2cfa2ed336ce105c8dea5bfe0825407e37ef296193ae162f515213fe43c8d5"
85m  Normal  AddedInterface  pod/collector-b68pd  Add eth0 [10.217.0.63/23] from ovn-kubernetes
85m  Normal  Started  pod/collector-b68pd  Started container collector
85m  Normal  Created  pod/collector-b68pd  Created container collector
85m  Normal  Pulled  pod/collector-b68pd  Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:fa2cfa2ed336ce105c8dea5bfe0825407e37ef296193ae162f515213fe43c8d5" in 6.654s (6.654s including waiting). Image size: 344557702 bytes.
79m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
79m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/address-pool
79m  Warning  deprecatedAnnotation  service/openstack-logging  Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
79m  Normal  IPAllocated  service/openstack-logging  Assigned IP ["172.17.0.80"]
79m  Normal  nodeAssigned  service/openstack-logging  announcing from node "crc" with protocol "layer2"
7m40s  Warning  Unhealthy  pod/logging-loki-distributor-5f678c8dd6-k62tn  Readiness probe failed: Get "https://10.217.0.52:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
7m40s  Warning  ProbeError  pod/logging-loki-distributor-5f678c8dd6-k62tn  Readiness probe error: Get "https://10.217.0.52:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
7m39s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m39s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m34s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe failed: Get "https://10.217.0.56:8081/ready": context deadline exceeded
7m34s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe error: Get "https://10.217.0.56:8081/ready": context deadline exceeded...
7m34s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe failed: Get "https://10.217.0.55:8083/ready": context deadline exceeded
7m34s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe error: Get "https://10.217.0.55:8083/ready": context deadline exceeded...
7m34s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe failed: Get "https://10.217.0.56:8083/ready": context deadline exceeded
7m34s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe error: Get "https://10.217.0.56:8083/ready": context deadline exceeded...
7m34s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe error: Get "https://10.217.0.55:8081/ready": context deadline exceeded...
7m34s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe failed: Get "https://10.217.0.55:8081/ready": context deadline exceeded
7m30s  Warning  ProbeError  pod/logging-loki-distributor-5f678c8dd6-k62tn  Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m30s  Warning  Unhealthy  pod/logging-loki-querier-76788598db-rg7hv  Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m30s  Warning  ProbeError  pod/logging-loki-querier-76788598db-rg7hv  Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m30s  Warning  Unhealthy  pod/logging-loki-distributor-5f678c8dd6-k62tn  Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m30s  Warning  Unhealthy  pod/logging-loki-query-frontend-69d9546745-mlj7c  Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m30s  Warning  ProbeError  pod/logging-loki-query-frontend-69d9546745-mlj7c  Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m29s  Warning  Unhealthy  pod/logging-loki-index-gateway-0  Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m29s  Warning  Unhealthy  pod/logging-loki-compactor-0  Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m29s  Warning  ProbeError  pod/logging-loki-compactor-0  Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m29s  Warning  ProbeError  pod/logging-loki-index-gateway-0  Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m29s  Warning  ProbeError  pod/logging-loki-ingester-0  Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m29s  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m24s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe error: Get "https://10.217.0.55:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
7m24s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe failed: Get "https://10.217.0.55:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
7m24s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
7m24s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
7m20s  Warning  ProbeError  pod/logging-loki-querier-76788598db-rg7hv  Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m20s  Warning  Unhealthy  pod/logging-loki-distributor-5f678c8dd6-k62tn  Readiness probe failed: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m20s  Warning  ProbeError  pod/logging-loki-distributor-5f678c8dd6-k62tn  Readiness probe error: Get "https://10.217.0.52:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m20s  Warning  Unhealthy  pod/logging-loki-querier-76788598db-rg7hv  Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m20s  Warning  Unhealthy  pod/logging-loki-query-frontend-69d9546745-mlj7c  Readiness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m20s  Warning  ProbeError  pod/logging-loki-query-frontend-69d9546745-mlj7c  Readiness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m19s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe error: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m19s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m19s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m19s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe failed: Get "https://10.217.0.55:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m19s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe error: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m19s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Readiness probe failed: Get "https://10.217.0.55:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m19s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m19s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m18s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Liveness probe failed: Get "https://10.217.0.56:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m18s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Liveness probe error: Get "https://10.217.0.55:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m18s  Warning  Unhealthy  pod/logging-loki-gateway-7dbfd5bb68-zmqn9  Liveness probe failed: Get "https://10.217.0.55:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
7m18s  Warning  ProbeError  pod/logging-loki-gateway-7dbfd5bb68-zslxm  Liveness probe error: Get "https://10.217.0.56:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
7m9s  Normal  nodeAssigned  service/openstack-logging  announcing from node "crc" with protocol "layer2"
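A dump this size is easier to triage by summarizing the Warning events per object first. A minimal sketch (Python, not part of the original output; it assumes each event line follows the `LAST SEEN  TYPE  REASON  OBJECT  MESSAGE` layout shown above, and the sample lines are copied from the dump):

```python
# Sketch: count Warning events per object in an `oc get events`-style dump.
# Assumes whitespace-separated columns: last-seen, type, reason, object, message.
from collections import Counter

def warning_objects(dump: str) -> Counter:
    counts: Counter = Counter()
    for line in dump.splitlines():
        parts = line.split(maxsplit=4)  # message may itself contain spaces
        if len(parts) >= 4 and parts[1] == "Warning":
            counts[parts[3]] += 1      # key on the OBJECT column
    return counts

# Sample lines taken from the event dump above.
sample = """\
86m  Warning  FailedMount  pod/logging-loki-gateway-7dbfd5bb68-zslxm  MountVolume.SetUp failed for volume "tls-secret" : secret "logging-loki-gateway-http" not found
86m  Warning  Unhealthy  pod/logging-loki-ingester-0  Readiness probe failed: HTTP probe failed with statuscode: 503
85m  Normal  Started  pod/collector-b68pd  Started container collector
"""

print(warning_objects(sample))
```

Sorting the resulting Counter by count (`.most_common()`) surfaces the noisiest objects, which here would point at the gateway and Loki component pods with repeated probe failures.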