From 3fe94d3386f06ae988387e43bdecf62d8cd7d410 Mon Sep 17 00:00:00 2001
From: writinwaters <93570324+writinwaters@users.noreply.github.com>
Date: Fri, 26 Dec 2025 21:33:55 +0800
Subject: [PATCH] Docs: Fixed a display issue (#12259)

### Type of change

- [x] Documentation Update
---
 docs/guides/models/deploy_local_llm.mdx | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/guides/models/deploy_local_llm.mdx b/docs/guides/models/deploy_local_llm.mdx
index 077cd10be..32fa224b1 100644
--- a/docs/guides/models/deploy_local_llm.mdx
+++ b/docs/guides/models/deploy_local_llm.mdx
@@ -338,14 +338,14 @@ Application startup complete.
 ```
-### 5.2 INTERGRATEING RAGFLOW WITH VLLM CHAT/EM/RERANK LLM WITH WEBUI
+### 5.2 Integrating RAGFlow with vLLM chat/embedding/rerank models via the web UI
-setting->model providers->search->vllm->add ,configure as follow:
+Settings -> Model providers -> Search -> vllm -> Add, then configure as follows:
 ![add vllm](https://github.com/user-attachments/assets/6f1d9f1a-3507-465b-87a3-4427254fff86)
-select vllm chat model as default llm model as follow:
+Select the vLLM chat model as the default LLM, as follows:
 ![chat](https://github.com/user-attachments/assets/05efbd4b-2c18-4c6b-8d1c-52bae712372d)
 ### 5.3 chat with vllm chat model
-create chat->create conversations-chat as follow:
+Create chat -> Create conversation -> Chat, as follows:
 ![chat](https://github.com/user-attachments/assets/dc1885f6-23a9-48f1-8850-d5f59b5e8f67)
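+
+To sanity-check the vLLM server outside RAGFlow, you can call its OpenAI-compatible chat endpoint directly. This is a minimal sketch: it assumes vLLM's default port 8000 on localhost, and the model name is a placeholder you must replace with the model you actually served.
+
+```bash
+curl http://localhost:8000/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -d '{"model": "<your-served-model>", "messages": [{"role": "user", "content": "Hello"}]}'
+```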