From 3fe94d3386f06ae988387e43bdecf62d8cd7d410 Mon Sep 17 00:00:00 2001
From: writinwaters <93570324+writinwaters@users.noreply.github.com>
Date: Fri, 26 Dec 2025 21:33:55 +0800
Subject: [PATCH] Docs: Fixed a display issue (#12259)
### Type of change
- [x] Documentation Update
---
docs/guides/models/deploy_local_llm.mdx | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/docs/guides/models/deploy_local_llm.mdx b/docs/guides/models/deploy_local_llm.mdx
index 077cd10be..32fa224b1 100644
--- a/docs/guides/models/deploy_local_llm.mdx
+++ b/docs/guides/models/deploy_local_llm.mdx
@@ -338,14 +338,14 @@ Application startup complete.
```
### 5.2 Integrating RAGFlow with vLLM chat/embedding/rerank models via the web UI
-setting->model providers->search->vllm->add ,configure as follow:
+setting -> model providers -> search -> vllm -> add, then configure as follows:

-select vllm chat model as default llm model as follow:
+select the vllm chat model as the default LLM model, as follows:

### 5.3 Chat with the vLLM chat model
-create chat->create conversations-chat as follow:
+create chat -> create conversation -> chat, as follows:

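For readers following the patched section of the guide: before adding vLLM as a provider in RAGFlow's web UI, the vLLM server itself must be reachable. A minimal sketch of starting vLLM's OpenAI-compatible entrypoint and checking it, assuming an example model name and the default port 8000 (substitute whatever model and port you actually serve):

```shell
# Start vLLM's OpenAI-compatible API server.
# The model name and port are illustrative assumptions, not values from the patch.
python -m vllm.entrypoints.openai.api_server \
    --model Qwen/Qwen2.5-7B-Instruct \
    --port 8000

# From another terminal, confirm the server answers before filling in
# RAGFlow's "model providers -> vllm -> add" dialog:
curl http://localhost:8000/v1/models
```

The base URL entered in RAGFlow's provider dialog would then point at this endpoint (e.g. `http://<host>:8000/v1`).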