LAST SEEN  TYPE  REASON  OBJECT  MESSAGE
72m Normal RequirementsUnknown clusterserviceversion/cluster-logging.v6.2.9 requirements not yet checked
72m Normal RequirementsNotMet clusterserviceversion/cluster-logging.v6.2.9 one or more requirements couldn't be found
72m Normal AllRequirementsMet clusterserviceversion/cluster-logging.v6.2.9 all requirements found, attempting install
72m Normal Scheduled pod/cluster-logging-operator-66689c4bbf-vjzsc Successfully assigned openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc to crc
72m Normal InstallWaiting clusterserviceversion/cluster-logging.v6.2.9 installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
72m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.9 waiting for install components to report healthy
72m Normal SuccessfulCreate replicaset/cluster-logging-operator-66689c4bbf Created pod: cluster-logging-operator-66689c4bbf-vjzsc
72m Normal ScalingReplicaSet deployment/cluster-logging-operator Scaled up replica set cluster-logging-operator-66689c4bbf to 1
72m Normal AddedInterface pod/cluster-logging-operator-66689c4bbf-vjzsc Add eth0 [10.217.0.48/23] from ovn-kubernetes
72m Normal Pulling pod/cluster-logging-operator-66689c4bbf-vjzsc Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef"
71m Normal Started pod/cluster-logging-operator-66689c4bbf-vjzsc Started container cluster-logging-operator
71m Normal Created pod/cluster-logging-operator-66689c4bbf-vjzsc Created container cluster-logging-operator
71m Normal Pulled pod/cluster-logging-operator-66689c4bbf-vjzsc Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef" in 12.451s (12.451s including waiting). Image size: 343887794 bytes.
71m Normal InstallSucceeded clusterserviceversion/cluster-logging.v6.2.9 install strategy completed with no errors
71m Normal WaitForFirstConsumer persistentvolumeclaim/wal-logging-loki-ingester-0 waiting for first consumer to be created before binding
71m Normal NoPods poddisruptionbudget/logging-loki-ingester No matching pods found
71m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
71m Normal SuccessfulCreate statefulset/logging-loki-ingester create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
71m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-ingester-0 waiting for first consumer to be created before binding
71m Normal ScalingReplicaSet deployment/logging-loki-distributor Scaled up replica set logging-loki-distributor-9c6b6d984 to 1
71m Normal SuccessfulCreate replicaset/logging-loki-distributor-9c6b6d984 Created pod: logging-loki-distributor-9c6b6d984-rznlv
71m Normal Scheduled pod/logging-loki-distributor-9c6b6d984-rznlv Successfully assigned openshift-logging/logging-loki-distributor-9c6b6d984-rznlv to crc
71m Normal Pulling pod/logging-loki-gateway-5d45f4dcf6-hkx57 Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
71m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
71m Normal Scheduled pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Successfully assigned openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9 to crc
71m Normal ScalingReplicaSet deployment/logging-loki-querier Scaled up replica set logging-loki-querier-6dcbdf8bb8 to 1
71m Normal Provisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
71m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-index-gateway-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
71m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-index-gateway-0 Successfully provisioned volume pvc-20552b06-44ca-49b7-b773-634d5b66321b
71m Normal NoPods poddisruptionbudget/logging-loki-querier No matching pods found
71m Normal SuccessfulCreate replicaset/logging-loki-querier-6dcbdf8bb8 Created pod: logging-loki-querier-6dcbdf8bb8-cm5jv
71m Normal Pulling pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
71m Normal Pulling pod/logging-loki-querier-6dcbdf8bb8-cm5jv Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
71m Normal AddedInterface pod/logging-loki-querier-6dcbdf8bb8-cm5jv Add eth0 [10.217.0.54/23] from ovn-kubernetes
71m Normal ProvisioningSucceeded persistentvolumeclaim/wal-logging-loki-ingester-0 Successfully provisioned volume pvc-ad328127-08a3-4d3c-b4cd-28a836727a41
71m Normal ExternalProvisioning persistentvolumeclaim/wal-logging-loki-ingester-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
71m Normal Provisioning persistentvolumeclaim/wal-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
71m Normal Scheduled pod/logging-loki-querier-6dcbdf8bb8-cm5jv Successfully assigned openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv to crc
71m Normal SuccessfulCreate statefulset/logging-loki-compactor create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
71m Normal SuccessfulCreate statefulset/logging-loki-compactor create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
71m Normal SuccessfulCreate statefulset/logging-loki-ingester create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
71m Normal AddedInterface pod/logging-loki-distributor-9c6b6d984-rznlv Add eth0 [10.217.0.53/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-distributor-9c6b6d984-rznlv Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
71m Normal SuccessfulCreate replicaset/logging-loki-query-frontend-ff66c4dc9 Created pod: logging-loki-query-frontend-ff66c4dc9-5qfh9
71m Normal ScalingReplicaSet deployment/logging-loki-query-frontend Scaled up replica set logging-loki-query-frontend-ff66c4dc9 to 1
71m Normal Scheduled pod/logging-loki-gateway-5d45f4dcf6-hkx57 Successfully assigned openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57 to crc
71m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-ingester-0 Successfully provisioned volume pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2
71m Normal AddedInterface pod/logging-loki-gateway-5d45f4dcf6-hkx57 Add eth0 [10.217.0.57/23] from ovn-kubernetes
71m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-index-gateway-0 waiting for first consumer to be created before binding
71m Normal Provisioning persistentvolumeclaim/storage-logging-loki-ingester-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
71m Normal AddedInterface pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Add eth0 [10.217.0.55/23] from ovn-kubernetes
71m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
71m Normal Scheduled pod/logging-loki-gateway-5d45f4dcf6-4f49c Successfully assigned openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c to crc
71m Normal AddedInterface pod/logging-loki-gateway-5d45f4dcf6-4f49c Add eth0 [10.217.0.56/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-gateway-5d45f4dcf6-4f49c Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
71m Normal NoPods poddisruptionbudget/logging-loki-index-gateway No matching pods found
71m Normal SuccessfulCreate statefulset/logging-loki-index-gateway create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
71m Normal ProvisioningSucceeded persistentvolumeclaim/storage-logging-loki-compactor-0 Successfully provisioned volume pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1
71m Normal Provisioning persistentvolumeclaim/storage-logging-loki-compactor-0 External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
71m Normal ExternalProvisioning persistentvolumeclaim/storage-logging-loki-compactor-0 Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
71m Normal ScalingReplicaSet deployment/logging-loki-gateway Scaled up replica set logging-loki-gateway-5d45f4dcf6 to 2
71m Normal SuccessfulCreate replicaset/logging-loki-gateway-5d45f4dcf6 Created pod: logging-loki-gateway-5d45f4dcf6-hkx57
71m Normal SuccessfulCreate replicaset/logging-loki-gateway-5d45f4dcf6 Created pod: logging-loki-gateway-5d45f4dcf6-4f49c
71m Normal WaitForFirstConsumer persistentvolumeclaim/storage-logging-loki-compactor-0 waiting for first consumer to be created before binding
71m Normal Scheduled pod/logging-loki-index-gateway-0 Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
71m Normal Scheduled pod/logging-loki-ingester-0 Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
71m Normal AddedInterface pod/logging-loki-ingester-0 Add eth0 [10.217.0.58/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-ingester-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
71m Normal Scheduled pod/logging-loki-compactor-0 Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
71m Normal AddedInterface pod/logging-loki-index-gateway-0 Add eth0 [10.217.0.61/23] from ovn-kubernetes
71m Normal Pulling pod/logging-loki-index-gateway-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
71m Normal Pulling pod/logging-loki-compactor-0 Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
71m Normal AddedInterface pod/logging-loki-compactor-0 Add eth0 [10.217.0.60/23] from ovn-kubernetes
71m Normal Pulled pod/logging-loki-ingester-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.641s (2.641s including waiting). Image size: 225648085 bytes.
71m Normal Pulled pod/logging-loki-distributor-9c6b6d984-rznlv Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.658s (3.658s including waiting). Image size: 225648085 bytes.
71m Normal Started pod/logging-loki-distributor-9c6b6d984-rznlv Started container loki-distributor
71m Normal Pulled pod/logging-loki-gateway-5d45f4dcf6-4f49c Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 3.521s (3.521s including waiting). Image size: 181406084 bytes.
71m Normal Pulled pod/logging-loki-gateway-5d45f4dcf6-hkx57 Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 3.407s (3.407s including waiting). Image size: 181406084 bytes.
71m Normal Created pod/logging-loki-gateway-5d45f4dcf6-hkx57 Created container gateway
71m Normal Started pod/logging-loki-gateway-5d45f4dcf6-hkx57 Started container gateway
71m Normal Pulling pod/logging-loki-gateway-5d45f4dcf6-hkx57 Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
71m Normal Created pod/logging-loki-gateway-5d45f4dcf6-4f49c Created container gateway
71m Normal Started pod/logging-loki-ingester-0 Started container loki-ingester
71m Normal Created pod/logging-loki-distributor-9c6b6d984-rznlv Created container loki-distributor
71m Normal Pulled pod/logging-loki-compactor-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.369s (2.369s including waiting). Image size: 225648085 bytes.
71m Normal Created pod/logging-loki-compactor-0 Created container loki-compactor
71m Normal Started pod/logging-loki-compactor-0 Started container loki-compactor
71m Normal Pulled pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.607s (3.607s including waiting). Image size: 225648085 bytes.
71m Normal Created pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Created container loki-query-frontend
71m Normal Started pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Started container loki-query-frontend
71m Normal Started pod/logging-loki-querier-6dcbdf8bb8-cm5jv Started container loki-querier
71m Normal Created pod/logging-loki-querier-6dcbdf8bb8-cm5jv Created container loki-querier
71m Normal Pulled pod/logging-loki-querier-6dcbdf8bb8-cm5jv Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 3.582s (3.582s including waiting). Image size: 225648085 bytes.
71m Normal Created pod/logging-loki-ingester-0 Created container loki-ingester
71m Normal Pulled pod/logging-loki-index-gateway-0 Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 2.306s (2.306s including waiting). Image size: 225648085 bytes.
71m Normal Started pod/logging-loki-gateway-5d45f4dcf6-4f49c Started container gateway
71m Normal Pulling pod/logging-loki-gateway-5d45f4dcf6-4f49c Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
71m Normal Created pod/logging-loki-index-gateway-0 Created container loki-index-gateway
71m Normal Started pod/logging-loki-index-gateway-0 Started container loki-index-gateway
71m Normal Started pod/logging-loki-gateway-5d45f4dcf6-hkx57 Started container opa
71m Normal Pulled pod/logging-loki-gateway-5d45f4dcf6-hkx57 Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.133s (2.133s including waiting). Image size: 161395402 bytes.
71m Normal Started pod/logging-loki-gateway-5d45f4dcf6-4f49c Started container opa
71m Normal Created pod/logging-loki-gateway-5d45f4dcf6-4f49c Created container opa
71m Normal Pulled pod/logging-loki-gateway-5d45f4dcf6-4f49c Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.154s (2.154s including waiting). Image size: 161395402 bytes.
71m Normal Created pod/logging-loki-gateway-5d45f4dcf6-hkx57 Created container opa
70m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
70m Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: HTTP probe failed with statuscode: 503
70m Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: HTTP probe failed with statuscode: 503...
69m Normal Scheduled pod/collector-rxpth Successfully assigned openshift-logging/collector-rxpth to crc
69m Warning FailedMount pod/collector-rxpth MountVolume.SetUp failed for volume "collector-syslog-receiver" : secret "collector-syslog-receiver" not found
69m Normal SuccessfulDelete daemonset/collector Deleted pod: collector-rxpth
69m Normal SuccessfulCreate daemonset/collector Created pod: collector-rxpth
69m Warning FailedMount pod/collector-rxpth MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
69m Normal Scheduled pod/collector-txhkk Successfully assigned openshift-logging/collector-txhkk to crc
69m Normal SuccessfulCreate daemonset/collector Created pod: collector-txhkk
69m Normal AddedInterface pod/collector-txhkk Add eth0 [10.217.0.69/23] from ovn-kubernetes
69m Normal Pulling pod/collector-txhkk Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203"
69m Normal Started pod/collector-txhkk Started container collector
69m Normal Created pod/collector-txhkk Created container collector
69m Normal Pulled pod/collector-txhkk Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203" in 3.645s (3.645s including waiting). Image size: 263139944 bytes.
64m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/address-pool
64m Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
64m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
64m Warning deprecatedAnnotation service/openstack-logging Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
64m Normal IPAllocated service/openstack-logging Assigned IP ["172.17.0.80"]
6m34s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m34s Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m34s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m34s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m34s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.61:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
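The FailedMount warnings above are easier to spot once the Warning events are isolated from the Normal noise. On a live cluster, `oc get events -n openshift-logging --field-selector type=Warning --sort-by='.lastTimestamp'` returns that subset directly; a minimal offline sketch against a saved dump (the `events.txt` file below is a hypothetical stand-in built from three lines of this output):

```shell
# Build a stand-in dump file from a few lines of the output above.
cat > events.txt <<'EOF'
69m Normal Scheduled pod/collector-rxpth Successfully assigned openshift-logging/collector-rxpth to crc
69m Warning FailedMount pod/collector-rxpth MountVolume.SetUp failed for volume "collector-syslog-receiver" : secret "collector-syslog-receiver" not found
69m Warning FailedMount pod/collector-rxpth MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
EOF

# Keep only rows whose TYPE column (field 2) is Warning.
awk '$2 == "Warning"' events.txt
```

Here both surviving rows share the reason FailedMount, pointing at secrets the collector pod mounts before they exist; the later `SuccessfulDelete`/`SuccessfulCreate` pair suggests the daemonset pod was replaced once the mounts resolved.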
6m34s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: Get "https://10.217.0.60:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m25s Warning Unhealthy pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Readiness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m25s Warning ProbeError pod/logging-loki-querier-6dcbdf8bb8-cm5jv Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m25s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m25s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m25s Warning Unhealthy pod/logging-loki-querier-6dcbdf8bb8-cm5jv Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m25s Warning ProbeError pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Readiness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m20s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m20s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m20s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m20s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m20s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe error: Get "https://10.217.0.57:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m20s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m20s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m20s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe failed: Get "https://10.217.0.57:8083/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m15s Warning ProbeError pod/logging-loki-querier-6dcbdf8bb8-cm5jv Readiness probe error: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m15s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe failed: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe error: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m15s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe failed: Get "https://10.217.0.56:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe failed: Get "https://10.217.0.57:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe error: Get "https://10.217.0.57:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m15s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe failed: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m15s Warning Unhealthy pod/logging-loki-querier-6dcbdf8bb8-cm5jv Readiness probe failed: Get "https://10.217.0.54:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning Unhealthy pod/logging-loki-distributor-9c6b6d984-rznlv Readiness probe failed: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Liveness probe failed: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-querier-6dcbdf8bb8-cm5jv Readiness probe error: Get "https://10.217.0.54:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-querier-6dcbdf8bb8-cm5jv Liveness probe failed: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-querier-6dcbdf8bb8-cm5jv Liveness probe error: Get "https://10.217.0.54:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning ProbeError pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Readiness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Readiness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: Get "https://10.217.0.60:3101/ready": context deadline exceeded...
6m14s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: Get "https://10.217.0.60:3101/ready": context deadline exceeded
6m14s Warning ProbeError pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Liveness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 Liveness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-ingester-0 Readiness probe error: Get "https://10.217.0.58:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning ProbeError pod/logging-loki-distributor-9c6b6d984-rznlv Readiness probe error: Get "https://10.217.0.53:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning ProbeError pod/logging-loki-distributor-9c6b6d984-rznlv Liveness probe error: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-index-gateway-0 Readiness probe failed: Get "https://10.217.0.61:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-index-gateway-0 Readiness probe error: Get "https://10.217.0.61:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-querier-6dcbdf8bb8-cm5jv Readiness probe failed: Get "https://10.217.0.54:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m14s Warning Unhealthy pod/logging-loki-distributor-9c6b6d984-rznlv Liveness probe failed: Get "https://10.217.0.53:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Liveness probe error: Get "https://10.217.0.57:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-hkx57 Liveness probe failed: Get "https://10.217.0.57:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Liveness probe error: Get "https://10.217.0.57:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-4f49c Liveness probe error: Get "https://10.217.0.56:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-4f49c Liveness probe failed: Get "https://10.217.0.56:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m14s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-4f49c Liveness probe error: Get "https://10.217.0.56:8081/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m14s Warning Unhealthy pod/logging-loki-gateway-5d45f4dcf6-4f49c Liveness probe failed: Get "https://10.217.0.56:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m13s Warning ProbeError pod/logging-loki-index-gateway-0 Liveness probe error: Get "https://10.217.0.61:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m13s Warning Unhealthy pod/logging-loki-index-gateway-0 Liveness probe failed: Get "https://10.217.0.61:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m13s Warning Unhealthy pod/logging-loki-compactor-0 Liveness probe failed: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m13s Warning ProbeError pod/logging-loki-compactor-0 Liveness probe error: Get "https://10.217.0.60:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m13s Warning ProbeError pod/logging-loki-ingester-0 Liveness probe error: Get "https://10.217.0.58:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m13s Warning Unhealthy pod/logging-loki-ingester-0 Liveness probe failed: Get "https://10.217.0.58:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m5s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-hkx57 Readiness probe error: Get "https://10.217.0.57:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m5s Warning ProbeError pod/logging-loki-gateway-5d45f4dcf6-4f49c Readiness probe error: Get "https://10.217.0.56:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m3s Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
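The probe-failure flood in the last few minutes hits every Loki pod at once, which reads less like one bad pod and more like node-wide pressure. A quick per-object tally of Warning events makes that pattern visible; a sketch against a saved dump (the `probe-events.txt` stand-in below uses a few lines from this output; on a live cluster the input would come from `oc get events -n openshift-logging --field-selector type=Warning`):

```shell
# Stand-in dump: three probe warnings plus one Normal row as a control.
cat > probe-events.txt <<'EOF'
6m34s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: timeout
6m34s Warning Unhealthy pod/logging-loki-compactor-0 Readiness probe failed: timeout
6m34s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: timeout
6m3s Normal nodeAssigned service/openstack-logging announcing from node "crc" with protocol "layer2"
EOF

# Count Warning events per OBJECT (field 4), most affected first.
awk '$2 == "Warning" { n[$4]++ } END { for (o in n) print n[o], o }' probe-events.txt | sort -rn
```

Run over the full dump, a roughly even spread of counts across all the LokiStack pods supports the shared-cause reading (e.g. the CRC node briefly starved of CPU or I/O), whereas one outlier object would point back at that workload.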