LAST SEEN   TYPE     REASON               OBJECT   MESSAGE
76m   Normal   RequirementsUnknown   clusterserviceversion/cluster-logging.v6.2.9   requirements not yet checked
76m   Normal   RequirementsNotMet   clusterserviceversion/cluster-logging.v6.2.9   one or more requirements couldn't be found
76m   Normal   AllRequirementsMet   clusterserviceversion/cluster-logging.v6.2.9   all requirements found, attempting install
76m   Normal   SuccessfulCreate   replicaset/cluster-logging-operator-66689c4bbf   Created pod: cluster-logging-operator-66689c4bbf-9lr46
76m   Normal   Scheduled   pod/cluster-logging-operator-66689c4bbf-9lr46   Successfully assigned openshift-logging/cluster-logging-operator-66689c4bbf-9lr46 to crc
76m   Normal   ScalingReplicaSet   deployment/cluster-logging-operator   Scaled up replica set cluster-logging-operator-66689c4bbf to 1
76m   Normal   InstallSucceeded   clusterserviceversion/cluster-logging.v6.2.9   waiting for install components to report healthy
76m   Normal   Pulling   pod/cluster-logging-operator-66689c4bbf-9lr46   Pulling image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef"
76m   Normal   AddedInterface   pod/cluster-logging-operator-66689c4bbf-9lr46   Add eth0 [10.217.0.46/23] from ovn-kubernetes
76m   Normal   InstallWaiting   clusterserviceversion/cluster-logging.v6.2.9   installing: waiting for deployment cluster-logging-operator to become ready: deployment "cluster-logging-operator" not available: Deployment does not have minimum availability.
76m   Normal   Pulled   pod/cluster-logging-operator-66689c4bbf-9lr46   Successfully pulled image "registry.redhat.io/openshift-logging/cluster-logging-rhel9-operator@sha256:bf86702a43f0ab4fda541a3aa8ace4560088be90be5eb59cd475ce1d97f5afef" in 11.286s (11.286s including waiting). Image size: 343887794 bytes.
76m   Normal   Created   pod/cluster-logging-operator-66689c4bbf-9lr46   Created container cluster-logging-operator
76m   Normal   Started   pod/cluster-logging-operator-66689c4bbf-9lr46   Started container cluster-logging-operator
76m   Normal   InstallSucceeded   clusterserviceversion/cluster-logging.v6.2.9   install strategy completed with no errors
76m   Normal   ProvisioningSucceeded   persistentvolumeclaim/storage-logging-loki-ingester-0   Successfully provisioned volume pvc-a753d979-56ef-4bd1-83cd-07087006d1f2
76m   Normal   ScalingReplicaSet   deployment/logging-loki-querier   Scaled up replica set logging-loki-querier-6dcbdf8bb8 to 1
76m   Normal   NoPods   poddisruptionbudget/logging-loki-query-frontend   No matching pods found
76m   Normal   SuccessfulCreate   statefulset/logging-loki-index-gateway   create Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway successful
76m   Normal   NoPods   poddisruptionbudget/logging-loki-index-gateway   No matching pods found
76m   Normal   SuccessfulCreate   statefulset/logging-loki-index-gateway   create Claim storage-logging-loki-index-gateway-0 Pod logging-loki-index-gateway-0 in StatefulSet logging-loki-index-gateway success
76m   Normal   SuccessfulCreate   statefulset/logging-loki-ingester   create Claim wal-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
76m   Normal   NoPods   poddisruptionbudget/logging-loki-ingester   No matching pods found
76m   Normal   SuccessfulCreate   statefulset/logging-loki-ingester   create Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester successful
76m   Normal   Scheduled   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Successfully assigned openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh to crc
76m   Normal   SuccessfulCreate   replicaset/logging-loki-query-frontend-ff66c4dc9   Created pod: logging-loki-query-frontend-ff66c4dc9-tcbf4
76m   Normal   WaitForFirstConsumer   persistentvolumeclaim/storage-logging-loki-compactor-0   waiting for first consumer to be created before binding
76m   Normal   ExternalProvisioning   persistentvolumeclaim/storage-logging-loki-compactor-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
76m   Normal   Provisioning   persistentvolumeclaim/storage-logging-loki-compactor-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-compactor-0"
76m   Normal   ProvisioningSucceeded   persistentvolumeclaim/storage-logging-loki-compactor-0   Successfully provisioned volume pvc-1fead652-bd08-4c9d-980a-e6862a1633b2
76m   Normal   Scheduled   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Successfully assigned openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4 to crc
76m   Normal   SuccessfulCreate   statefulset/logging-loki-ingester   create Claim storage-logging-loki-ingester-0 Pod logging-loki-ingester-0 in StatefulSet logging-loki-ingester success
76m   Normal   Provisioning   persistentvolumeclaim/storage-logging-loki-index-gateway-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-index-gateway-0"
76m   Normal   ProvisioningSucceeded   persistentvolumeclaim/wal-logging-loki-ingester-0   Successfully provisioned volume pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f
76m   Normal   Provisioning   persistentvolumeclaim/wal-logging-loki-ingester-0   External provisioner is provisioning volume for claim "openshift-logging/wal-logging-loki-ingester-0"
76m   Normal   ExternalProvisioning   persistentvolumeclaim/wal-logging-loki-ingester-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
76m   Normal   WaitForFirstConsumer   persistentvolumeclaim/wal-logging-loki-ingester-0   waiting for first consumer to be created before binding
76m   Normal   ScalingReplicaSet   deployment/logging-loki-query-frontend   Scaled up replica set logging-loki-query-frontend-ff66c4dc9 to 1
76m   Normal   Provisioning   persistentvolumeclaim/storage-logging-loki-ingester-0   External provisioner is provisioning volume for claim "openshift-logging/storage-logging-loki-ingester-0"
76m   Normal   SuccessfulCreate   statefulset/logging-loki-compactor   create Claim storage-logging-loki-compactor-0 Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor success
76m   Normal   SuccessfulCreate   statefulset/logging-loki-compactor   create Pod logging-loki-compactor-0 in StatefulSet logging-loki-compactor successful
76m   Normal   Scheduled   pod/logging-loki-distributor-9c6b6d984-s2ztv   Successfully assigned openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv to crc
76m   Normal   SuccessfulCreate   replicaset/logging-loki-querier-6dcbdf8bb8   Created pod: logging-loki-querier-6dcbdf8bb8-jgbgh
76m   Normal   ScalingReplicaSet   deployment/logging-loki-distributor   Scaled up replica set logging-loki-distributor-9c6b6d984 to 1
76m   Normal   SuccessfulCreate   replicaset/logging-loki-distributor-9c6b6d984   Created pod: logging-loki-distributor-9c6b6d984-s2ztv
76m   Normal   WaitForFirstConsumer   persistentvolumeclaim/storage-logging-loki-index-gateway-0   waiting for first consumer to be created before binding
76m   Normal   ExternalProvisioning   persistentvolumeclaim/storage-logging-loki-index-gateway-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
76m   Normal   ExternalProvisioning   persistentvolumeclaim/storage-logging-loki-ingester-0   Waiting for a volume to be created either by the external provisioner 'kubevirt.io.hostpath-provisioner' or manually by the system administrator. If volume creation is delayed, please verify that the provisioner is running and correctly registered.
76m   Normal   WaitForFirstConsumer   persistentvolumeclaim/storage-logging-loki-ingester-0   waiting for first consumer to be created before binding
76m   Normal   SuccessfulCreate   replicaset/logging-loki-gateway-b5bdf65c4   Created pod: logging-loki-gateway-b5bdf65c4-ldbjt
76m   Normal   Pulling   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
76m   Normal   ScalingReplicaSet   deployment/logging-loki-gateway   Scaled up replica set logging-loki-gateway-b5bdf65c4 to 2
76m   Normal   NoPods   poddisruptionbudget/logging-loki-gateway   No matching pods found
76m   Normal   Scheduled   pod/logging-loki-index-gateway-0   Successfully assigned openshift-logging/logging-loki-index-gateway-0 to crc
76m   Normal   Pulling   pod/logging-loki-distributor-9c6b6d984-s2ztv   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
76m   Normal   Scheduled   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Successfully assigned openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt to crc
76m   Normal   AddedInterface   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Add eth0 [10.217.0.54/23] from ovn-kubernetes
76m   Normal   Pulling   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
76m   Normal   Scheduled   pod/logging-loki-ingester-0   Successfully assigned openshift-logging/logging-loki-ingester-0 to crc
76m   Normal   SuccessfulCreate   replicaset/logging-loki-gateway-b5bdf65c4   Created pod: logging-loki-gateway-b5bdf65c4-vqfpz
76m   Normal   AddedInterface   pod/logging-loki-distributor-9c6b6d984-s2ztv   Add eth0 [10.217.0.50/23] from ovn-kubernetes
76m   Normal   AddedInterface   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Add eth0 [10.217.0.51/23] from ovn-kubernetes
76m   Normal   Scheduled   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Successfully assigned openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz to crc
76m   Normal   AddedInterface   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Add eth0 [10.217.0.53/23] from ovn-kubernetes
76m   Normal   AddedInterface   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Add eth0 [10.217.0.52/23] from ovn-kubernetes
76m   Normal   Pulling   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
76m   Normal   ProvisioningSucceeded   persistentvolumeclaim/storage-logging-loki-index-gateway-0   Successfully provisioned volume pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98
76m   Normal   Scheduled   pod/logging-loki-compactor-0   Successfully assigned openshift-logging/logging-loki-compactor-0 to crc
76m   Normal   AddedInterface   pod/logging-loki-compactor-0   Add eth0 [10.217.0.56/23] from ovn-kubernetes
76m   Normal   Pulling   pod/logging-loki-compactor-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
76m   Normal   Pulling   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Pulling image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167"
76m   Normal   Pulling   pod/logging-loki-ingester-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
76m   Normal   AddedInterface   pod/logging-loki-ingester-0   Add eth0 [10.217.0.55/23] from ovn-kubernetes
76m   Normal   Pulling   pod/logging-loki-index-gateway-0   Pulling image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb"
76m   Normal   AddedInterface   pod/logging-loki-index-gateway-0   Add eth0 [10.217.0.57/23] from ovn-kubernetes
75m   Normal   Created   pod/logging-loki-compactor-0   Created container loki-compactor
75m   Normal   Pulled   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 5.089s (5.089s including waiting). Image size: 181406084 bytes.
75m   Normal   Created   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Created container gateway
75m   Normal   Started   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Started container gateway
75m   Normal   Pulling   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
75m   Normal   Started   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Started container loki-query-frontend
75m   Normal   Started   pod/logging-loki-ingester-0   Started container loki-ingester
75m   Normal   Created   pod/logging-loki-ingester-0   Created container loki-ingester
75m   Normal   Created   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Created container loki-query-frontend
75m   Normal   Pulled   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 5.162s (5.162s including waiting). Image size: 225648085 bytes.
75m   Normal   Pulled   pod/logging-loki-ingester-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 4.568s (4.568s including waiting). Image size: 225648085 bytes.
75m   Normal   Started   pod/logging-loki-index-gateway-0   Started container loki-index-gateway
75m   Normal   Created   pod/logging-loki-index-gateway-0   Created container loki-index-gateway
75m   Normal   Pulled   pod/logging-loki-index-gateway-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 4.011s (4.011s including waiting). Image size: 225648085 bytes.
75m   Normal   Pulled   pod/logging-loki-compactor-0   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 4.104s (4.104s including waiting). Image size: 225648085 bytes.
75m   Normal   Started   pod/logging-loki-compactor-0   Started container loki-compactor
75m   Normal   Pulled   pod/logging-loki-distributor-9c6b6d984-s2ztv   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 5.532s (5.532s including waiting). Image size: 225648085 bytes.
75m   Normal   Created   pod/logging-loki-distributor-9c6b6d984-s2ztv   Created container loki-distributor
75m   Normal   Started   pod/logging-loki-distributor-9c6b6d984-s2ztv   Started container loki-distributor
75m   Normal   Pulling   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Pulling image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8"
75m   Normal   Started   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Started container gateway
75m   Normal   Started   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Started container loki-querier
75m   Normal   Created   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Created container loki-querier
75m   Normal   Pulled   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Successfully pulled image "registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:ebf28f8c032032395ba9a7aa247f68a25e9484857588e375b4660bc04aa0e0fb" in 5.446s (5.446s including waiting). Image size: 225648085 bytes.
75m   Normal   Created   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Created container gateway
75m   Normal   Pulled   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Successfully pulled image "registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:f4f8fc520eae06a5c7d8b98ef21bad88253666fc27db9b6bf35b19a63173a167" in 5.118s (5.118s including waiting). Image size: 181406084 bytes.
75m   Normal   Created   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Created container opa
75m   Normal   Created   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Created container opa
75m   Normal   Started   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Started container opa
75m   Normal   Pulled   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.287s (2.287s including waiting). Image size: 161395402 bytes.
75m   Normal   Started   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Started container opa
75m   Normal   Pulled   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Successfully pulled image "registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:d385c61e4b289c60cbb712e7c85c4af518e680cf24e9f7fb10acae2b281f08e8" in 2.249s (2.249s including waiting). Image size: 161395402 bytes.
75m   Warning   ProbeError   pod/logging-loki-ingester-0   Readiness probe error: HTTP probe failed with statuscode: 503...
75m   Warning   Unhealthy   pod/logging-loki-ingester-0   Readiness probe failed: HTTP probe failed with statuscode: 503
75m   Warning   ProbeError   pod/logging-loki-ingester-0   Readiness probe error: HTTP probe failed with statuscode: 503...
74m   Normal   SuccessfulCreate   daemonset/collector   Created pod: collector-d9cc2
74m   Normal   Scheduled   pod/collector-d9cc2   Successfully assigned openshift-logging/collector-d9cc2 to crc
74m   Normal   SuccessfulDelete   daemonset/collector   Deleted pod: collector-d9cc2
74m   Warning   FailedMount   pod/collector-d9cc2   MountVolume.SetUp failed for volume "collector-syslog-receiver" : secret "collector-syslog-receiver" not found
74m   Warning   FailedMount   pod/collector-d9cc2   MountVolume.SetUp failed for volume "metrics" : secret "collector-metrics" not found
74m   Normal   SuccessfulCreate   daemonset/collector   Created pod: collector-zx7v8
74m   Normal   Scheduled   pod/collector-zx7v8   Successfully assigned openshift-logging/collector-zx7v8 to crc
74m   Normal   Pulling   pod/collector-zx7v8   Pulling image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203"
74m   Normal   AddedInterface   pod/collector-zx7v8   Add eth0 [10.217.0.61/23] from ovn-kubernetes
74m   Normal   Created   pod/collector-zx7v8   Created container collector
74m   Normal   Pulled   pod/collector-zx7v8   Successfully pulled image "registry.redhat.io/openshift-logging/vector-rhel9@sha256:bacedab86fc0ed94a09cfa903e4d88e85b321df18224c43efcba4f6640aa8203" in 5.443s (5.443s including waiting). Image size: 263139944 bytes.
74m   Normal   Started   pod/collector-zx7v8   Started container collector
68m   Normal   IPAllocated   service/openstack-logging   Assigned IP ["172.17.0.80"]
68m   Warning   deprecatedAnnotation   service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/address-pool
68m   Warning   deprecatedAnnotation   service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/allow-shared-ip
68m   Warning   deprecatedAnnotation   service/openstack-logging   Service uses deprecated annotation metallb.universe.tf/loadBalancerIPs
68m   Normal   nodeAssigned   service/openstack-logging   announcing from node "crc" with protocol "layer2"
6m48s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe failed: Get "https://10.217.0.53:8081/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m48s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe error: Get "https://10.217.0.54:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m48s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe failed: Get "https://10.217.0.54:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m48s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe error: Get "https://10.217.0.54:8081/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m48s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe error: Get "https://10.217.0.53:8081/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m48s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe failed: Get "https://10.217.0.54:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m43s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe failed: Get "https://10.217.0.54:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m43s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe error: Get "https://10.217.0.53:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m43s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe error: Get "https://10.217.0.54:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m43s   Warning   ProbeError   pod/logging-loki-distributor-9c6b6d984-s2ztv   Readiness probe error: Get "https://10.217.0.50:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m43s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe failed: Get "https://10.217.0.53:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m43s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe failed: Get "https://10.217.0.53:8081/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m43s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe error: Get "https://10.217.0.53:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m43s   Warning   Unhealthy   pod/logging-loki-distributor-9c6b6d984-s2ztv   Readiness probe failed: Get "https://10.217.0.50:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m42s   Warning   ProbeError   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Readiness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m42s   Warning   Unhealthy   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Readiness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m42s   Warning   ProbeError   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Readiness probe error: Get "https://10.217.0.51:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m42s   Warning   Unhealthy   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Readiness probe failed: Get "https://10.217.0.51:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m34s   Warning   ProbeError   pod/logging-loki-distributor-9c6b6d984-s2ztv   Readiness probe error: Get "https://10.217.0.50:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m34s   Warning   Unhealthy   pod/logging-loki-distributor-9c6b6d984-s2ztv   Readiness probe failed: Get "https://10.217.0.50:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m33s   Warning   ProbeError   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Readiness probe error: Get "https://10.217.0.51:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m33s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe error: Get "https://10.217.0.54:8081/ready": context deadline exceeded...
6m33s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe error: Get "https://10.217.0.53:8081/ready": context deadline exceeded...
6m33s   Warning   ProbeError   pod/logging-loki-distributor-9c6b6d984-s2ztv   Liveness probe error: Get "https://10.217.0.50:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m33s   Warning   Unhealthy   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Readiness probe failed: Get "https://10.217.0.51:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m33s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe error: Get "https://10.217.0.54:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m33s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe failed: Get "https://10.217.0.53:8081/ready": context deadline exceeded
6m33s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe failed: Get "https://10.217.0.54:8083/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m33s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe failed: Get "https://10.217.0.53:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m33s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Readiness probe error: Get "https://10.217.0.53:8083/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m33s   Warning   Unhealthy   pod/logging-loki-distributor-9c6b6d984-s2ztv   Liveness probe failed: Get "https://10.217.0.50:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m33s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Readiness probe failed: Get "https://10.217.0.54:8081/ready": context deadline exceeded
6m33s   Warning   ProbeError   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Readiness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m33s   Warning   Unhealthy   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Readiness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   Unhealthy   pod/logging-loki-compactor-0   Readiness probe failed: Get "https://10.217.0.56:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   ProbeError   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Liveness probe error: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m32s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Liveness probe failed: Get "https://10.217.0.54:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-ldbjt   Liveness probe error: Get "https://10.217.0.54:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m32s   Warning   Unhealthy   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Liveness probe failed: Get "https://10.217.0.51:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   Unhealthy   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Liveness probe failed: Get "https://10.217.0.53:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   Unhealthy   pod/logging-loki-query-frontend-ff66c4dc9-tcbf4   Liveness probe failed: Get "https://10.217.0.52:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   ProbeError   pod/logging-loki-index-gateway-0   Readiness probe error: Get "https://10.217.0.57:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m32s   Warning   Unhealthy   pod/logging-loki-ingester-0   Readiness probe failed: Get "https://10.217.0.55:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   ProbeError   pod/logging-loki-ingester-0   Readiness probe error: Get "https://10.217.0.55:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)...
6m32s   Warning   Unhealthy   pod/logging-loki-index-gateway-0   Readiness probe failed: Get "https://10.217.0.57:3101/ready": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m32s   Warning   ProbeError   pod/logging-loki-gateway-b5bdf65c4-vqfpz   Liveness probe error: Get "https://10.217.0.53:8083/live": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m32s   Warning   ProbeError   pod/logging-loki-compactor-0   Readiness probe error: Get "https://10.217.0.56:3101/ready": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m32s   Warning   ProbeError   pod/logging-loki-querier-6dcbdf8bb8-jgbgh   Liveness probe error: Get "https://10.217.0.51:3101/loki/api/v1/status/buildinfo": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)...
6m31s   Warning   ProbeError   pod/logging-loki-index-gateway-0   Readiness probe error: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m31s   Warning   ProbeError   pod/logging-loki-ingester-0   Readiness probe error: Get "https://10.217.0.55:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m31s   Warning   Unhealthy   pod/logging-loki-index-gateway-0   Liveness probe failed: Get "https://10.217.0.57:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s   Warning   Unhealthy   pod/logging-loki-ingester-0   Liveness probe failed: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s   Warning   ProbeError   pod/logging-loki-ingester-0   Liveness probe error: Get "https://10.217.0.55:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m31s   Warning   ProbeError   pod/logging-loki-index-gateway-0   Liveness probe error: Get "https://10.217.0.57:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m31s   Warning   Unhealthy   pod/logging-loki-compactor-0   Liveness probe failed: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s   Warning   ProbeError   pod/logging-loki-compactor-0   Liveness probe error: Get "https://10.217.0.56:3101/loki/api/v1/status/buildinfo": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m31s   Warning   Unhealthy   pod/logging-loki-index-gateway-0   Readiness probe failed: Get "https://10.217.0.57:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s   Warning   Unhealthy   pod/logging-loki-ingester-0   Readiness probe failed: Get "https://10.217.0.55:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s   Warning   Unhealthy   pod/logging-loki-compactor-0   Readiness probe failed: Get "https://10.217.0.56:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
6m31s   Warning   ProbeError   pod/logging-loki-compactor-0   Readiness probe error: Get "https://10.217.0.56:3101/ready": net/http: request canceled (Client.Timeout exceeded while awaiting headers)...
6m27s   Normal   nodeAssigned   service/openstack-logging   announcing from node "crc" with protocol "layer2"
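A listing this long is easier to triage by tallying events per TYPE/REASON pair, which surfaces the dominant failure modes (here, the ProbeError/Unhealthy warnings) at a glance. Below is a minimal sketch, not part of the original capture: it parses `kubectl`/`oc get events`-style rows of the form `LAST-SEEN TYPE REASON OBJECT MESSAGE` from plain text; the function name `tally_events` and the sample rows are illustrative only.

```python
# Sketch: count (TYPE, REASON) pairs in `oc get events`-style output.
from collections import Counter


def tally_events(lines):
    """Tally (TYPE, REASON) pairs from 'LAST-SEEN TYPE REASON OBJECT MESSAGE' rows."""
    counts = Counter()
    for line in lines:
        # Split into at most 5 fields; the MESSAGE column may contain spaces.
        parts = line.split(None, 4)
        if len(parts) >= 4 and parts[1] in ("Normal", "Warning"):
            counts[(parts[1], parts[2])] += 1
    return counts


# Illustrative rows in the same shape as the dump above.
sample = [
    '76m Normal Scheduled pod/logging-loki-ingester-0 Successfully assigned openshift-logging/logging-loki-ingester-0 to crc',
    '6m31s Warning Unhealthy pod/logging-loki-ingester-0 Readiness probe failed: ...',
    '6m31s Warning ProbeError pod/logging-loki-compactor-0 Readiness probe error: ...',
]

for (etype, reason), n in tally_events(sample).most_common():
    print(f"{n:4d}  {etype:8s} {reason}")
```

Feeding the full dump through this (e.g. `oc get events -n openshift-logging --no-headers | python tally.py`) would show the Warning reasons clustered on readiness/liveness probes of the Loki pods.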