Fix errors (#11795)

### What problem does this PR solve?

- typos
- IDE warnings

### Type of change

- [x] Refactoring

---------

Signed-off-by: Jin Hai <haijin.chn@gmail.com>
Author: Jin Hai
Committed by: GitHub
Date: 2025-12-08 09:42:10 +08:00
Commit: 6546f86b4e
Parent: 8de6b97806
13 changed files with 35 additions and 35 deletions


@@ -305,7 +305,7 @@ With the Ollama service running, open a new terminal and run `./ollama pull <mod
 </TabItem>
 </Tabs>
-### 4. Configure RAGflow
+### 4. Configure RAGFlow
 To enable IPEX-LLM accelerated Ollama in RAGFlow, you must also complete the configurations in RAGFlow. The steps are identical to those outlined in the *Deploy a local model using Ollama* section:
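For context, the step referenced in this hunk's header pulls a model with the IPEX-LLM-built Ollama binary before RAGFlow is configured to use it. A minimal sketch of that step follows; the model name is an illustrative assumption, not part of this diff:

```bash
# Assumed example: pull a model with the locally built Ollama binary.
# The model tag is a placeholder; substitute the model you intend to serve.
./ollama pull qwen2:7b
```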