From 45b96acf6b04804e8151123dcfbcee8ee6f033e2 Mon Sep 17 00:00:00 2001
From: yiminghub2024 <482890@qq.com>
Date: Sun, 28 Dec 2025 19:29:33 +0800
Subject: [PATCH] Update deploy_local_llm.mdx vllm guide picture (#12275)

### Type of change

- [x] Documentation Update
---
 docs/guides/models/deploy_local_llm.mdx | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/guides/models/deploy_local_llm.mdx b/docs/guides/models/deploy_local_llm.mdx
index 32fa224b1..c47122000 100644
--- a/docs/guides/models/deploy_local_llm.mdx
+++ b/docs/guides/models/deploy_local_llm.mdx
@@ -340,13 +340,13 @@ Application startup complete.
 
 setting->model providers->search->vllm->add ,configure as follow:
 
-![add vllm](https://github.com/user-attachments/assets/6f1d9f1a-3507-465b-87a3-4427254fff86)
+![add vllm](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm.png)
 
 select vllm chat model as default llm model as follow:
 
-![chat](https://github.com/user-attachments/assets/05efbd4b-2c18-4c6b-8d1c-52bae712372d)
+![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
 
 ### 5.3 chat with vllm chat model
 create chat->create conversations-chat as follow:
 
-![chat](https://github.com/user-attachments/assets/dc1885f6-23a9-48f1-8850-d5f59b5e8f67)
+![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)