Fixed a broken link (#2190)
To fix a broken link.

### Type of change

- [x] Documentation Update

@@ -370,7 +370,7 @@ You limit what the system responds to what you specify in **Empty response** if

### 4. How to run RAGFlow with a locally deployed LLM?

-You can use Ollama to deploy local LLM. See [here](https://github.com/infiniflow/ragflow/blob/main/docs/guides/deploy_local_llm.md) for more information.
+You can use Ollama to deploy local LLM. See [here](../guides/deploy_local_llm.mdx) for more information.

### 5. How to link up ragflow and ollama servers?
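
For context on the FAQ entry touched above: a minimal sketch of checking that a locally deployed Ollama server is reachable before pointing RAGFlow at it. This is an assumption-laden illustration, not the content of the linked guide; it assumes Ollama's default endpoint `http://localhost:11434` and a placeholder model name `llama2` that has already been pulled with `ollama pull`.

```python
# Minimal sketch (assumption, not from the linked guide): verify that a locally
# deployed Ollama server responds before configuring it as RAGFlow's LLM backend.
# Assumes Ollama's default port 11434 and a placeholder model name "llama2".
import requests

OLLAMA_URL = "http://localhost:11434"


def ollama_generate(prompt: str, model: str = "llama2") -> str:
    """Send one non-streaming generation request to the local Ollama server."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    # A printed reply means the server RAGFlow will talk to is up and serving the model.
    print(ollama_generate("Say hello in one short sentence."))
```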