Update deploy_local_llm.mdx (#12276)

### Type of change

- [x] Documentation Update
This commit is contained in:
Yingfeng
2025-12-28 19:46:50 +08:00
committed by GitHub
parent 45b96acf6b
commit 2114b9e3ad


@@ -340,13 +340,13 @@ Application startup complete.
setting->model providers->search->vllm->add, configure as follows:
-![add vllm](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm.png)
+![add vllm](https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm.png)
select the vllm chat model as the default llm model as follows:
-![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
+![chat](https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm1.png)
### 5.3 chat with vllm chat model
create chat->create conversations->chat as follows:
-![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
+![chat](https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm2.png)
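Beyond chatting through the RAGFlow UI, the vLLM server behind this setup exposes an OpenAI-compatible API that can be queried directly to verify the model is reachable. Below is a minimal sketch, assuming the server listens at `http://localhost:8000` and the model name `my-vllm-model`; both are hypothetical placeholders, not values from this document.

```python
import json
from urllib import request

# Assumed endpoint of the local vLLM OpenAI-compatible server (hypothetical).
VLLM_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request for a vLLM server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        VLLM_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (but do not send) a request; sending it with
# request.urlopen(req) requires a running vLLM server.
req = build_chat_request("my-vllm-model", "Hello!")
print(req.get_method())  # POST
print(json.loads(req.data)["messages"][0]["content"])  # Hello!
```

A quick smoke test like this helps confirm the server configured in step 5.2 is serving chat completions before wiring it into RAGFlow conversations.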