Update deploy_local_llm.mdx vllm guide picture (#12275)

### Type of change
- [x] Documentation Update
yiminghub2024
2025-12-28 19:29:33 +08:00
committed by GitHub
parent 3305215144
commit 45b96acf6b


@@ -340,13 +340,13 @@ Application startup complete.
 setting->model providers->search->vllm->add ,configure as follow:
-![add vllm](https://github.com/user-attachments/assets/6f1d9f1a-3507-465b-87a3-4427254fff86)
+![add vllm](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm.png)
 select vllm chat model as default llm model as follow:
-![chat](https://github.com/user-attachments/assets/05efbd4b-2c18-4c6b-8d1c-52bae712372d)
+![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
 ### 5.3 chat with vllm chat model
 create chat->create conversations-chat as follow:
-![chat](https://github.com/user-attachments/assets/dc1885f6-23a9-48f1-8850-d5f59b5e8f67)
+![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
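For reference, the guide section this hunk edits walks through adding vLLM as a chat model provider in RAGFlow and then chatting with it. Before filling in the provider form, it can help to confirm the vLLM server is reachable and to read off the exact model id it serves. The sketch below is not part of this commit; it assumes vLLM's OpenAI-compatible server is running at the default `http://localhost:8000` and uses Python's `requests`, so adjust the base URL and model name to your own deployment.

```python
# Sketch: check the vLLM OpenAI-compatible endpoint that RAGFlow's
# vllm provider will talk to. Assumes the default localhost:8000
# address; change base_url if your server listens elsewhere.
import requests

base_url = "http://localhost:8000/v1"  # assumed default vLLM address

# List served models; the "id" returned here is the model name to
# enter when adding the vllm provider in RAGFlow's settings page.
models = requests.get(f"{base_url}/models", timeout=10).json()
model_id = models["data"][0]["id"]
print("served model:", model_id)

# Minimal chat completion to confirm the chat model responds.
resp = requests.post(
    f"{base_url}/chat/completions",
    json={
        "model": model_id,
        "messages": [{"role": "user", "content": "Hello from a RAGFlow setup check"}],
        "max_tokens": 32,
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```

If both calls succeed, the same base URL and model id can be used in the setting->model providers->vllm form, and the chat flow shown in the screenshots above should work end to end.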