diff --git a/docs/guides/models/deploy_local_llm.mdx b/docs/guides/models/deploy_local_llm.mdx
index 077cd10be..32fa224b1 100644
--- a/docs/guides/models/deploy_local_llm.mdx
+++ b/docs/guides/models/deploy_local_llm.mdx
@@ -338,14 +338,14 @@ Application startup complete.
 ```
 ### 5.2 Integrating RAGFlow with vLLM chat/embedding/rerank models via the web UI
-setting->model providers->search->vllm->add ,configure as follow:
+In the web UI, go to Settings -> Model providers -> search for vllm -> Add, then configure it as follows: ![add vllm](https://github.com/user-attachments/assets/6f1d9f1a-3507-465b-87a3-4427254fff86)
-select vllm chat model as default llm model as follow:
+Select the vLLM chat model as the default LLM, as follows: ![chat](https://github.com/user-attachments/assets/05efbd4b-2c18-4c6b-8d1c-52bae712372d)
 ### 5.3 Chat with the vLLM chat model
-create chat->create conversations-chat as follow:
+Create a chat -> create a conversation -> chat, as follows: ![chat](https://github.com/user-attachments/assets/dc1885f6-23a9-48f1-8850-d5f59b5e8f67)
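If the chat model fails to respond after these steps, it can help to check the vLLM server outside of RAGFlow. The sketch below builds the kind of OpenAI-compatible chat-completions request that RAGFlow issues to a vLLM backend; the base URL and model id are assumptions — substitute the values from your own `vllm serve` deployment.

```python
import json

# Hypothetical values: replace with your actual vLLM endpoint and model id.
BASE_URL = "http://localhost:8000/v1"
MODEL_ID = "Qwen/Qwen2.5-7B-Instruct"

def build_chat_request(prompt: str) -> tuple[str, str]:
    """Return (url, json_body) for an OpenAI-compatible chat-completions call."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    })
    return url, body

url, body = build_chat_request("Hello")
print(url)   # the endpoint RAGFlow's vLLM provider would call
```

You can send the same URL and body with `curl` or any HTTP client to confirm the server answers before wiring it into RAGFlow.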