Mirror of https://github.com/infiniflow/ragflow.git, synced 2025-12-08 20:42:30 +08:00
Fix errors (#11795)
### What problem does this PR solve?

- Typos
- IDE warnings

### Type of change

- [x] Refactoring

Signed-off-by: Jin Hai <haijin.chn@gmail.com>
@@ -305,7 +305,7 @@ With the Ollama service running, open a new terminal and run `./ollama pull <mod
 </TabItem>
 </Tabs>

-### 4. Configure RAGflow
+### 4. Configure RAGFlow

 To enable IPEX-LLM accelerated Ollama in RAGFlow, you must also complete the configurations in RAGFlow. The steps are identical to those outlined in the *Deploy a local model using Ollama* section:
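The paragraph touched by this hunk points back to the *Deploy a local model using Ollama* configuration steps. As a rough, hedged sketch only (the model name, host, and port below are assumptions for illustration and are not taken from this commit), that flow looks roughly like this:

```bash
# Hypothetical sketch of the flow the doc paragraph refers to; names are examples.
# 1. With the (IPEX-LLM accelerated) Ollama service running, pull a model:
./ollama pull qwen2:7b        # "qwen2:7b" is an assumed example model name

# 2. In the RAGFlow web UI, add Ollama as a model provider and point it at the
#    Ollama endpoint (Ollama listens on port 11434 by default), for example:
#    Base URL: http://<ollama-host>:11434
```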