diff --git a/docs/guides/models/deploy_local_llm.mdx b/docs/guides/models/deploy_local_llm.mdx
index c47122000..7d8e58eee 100644
--- a/docs/guides/models/deploy_local_llm.mdx
+++ b/docs/guides/models/deploy_local_llm.mdx
@@ -340,13 +340,13 @@ Application startup complete.
 setting->model providers->search->vllm->add ,configure as follow:
 
-![add vllm](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm.png)
+![add vllm](https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm.png)
 
 select vllm chat model as default llm model as follow:
 
-![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
+![chat](https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm1.png)
 
 ### 5.3 chat with vllm chat model
 
 create chat->create conversations-chat as follow:
 
-![chat](https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm1.png)
+![chat](https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm2.png)
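
The diff replaces GitHub `blob` page URLs with `raw.githubusercontent.com` URLs: a `blob` URL points at GitHub's HTML file-viewer page, so embedding it in a Markdown image tag renders a broken image, while the raw host serves the file bytes directly. A minimal sketch of that rewrite, using a hypothetical helper `to_raw_url` (not part of the docs or RAGFlow):

```python
import re

def to_raw_url(blob_url: str) -> str:
    """Convert https://github.com/<owner>/<repo>/blob/<branch>/<path>
    to https://raw.githubusercontent.com/<owner>/<repo>/<branch>/<path>.

    Hypothetical helper illustrating the URL change made in this diff.
    """
    return re.sub(
        r"^https://github\.com/([^/]+)/([^/]+)/blob/",
        r"https://raw.githubusercontent.com/\1/\2/",
        blob_url,
    )

print(to_raw_url(
    "https://github.com/infiniflow/ragflow-docs/blob/main/images/ragflow_vllm.png"
))
# -> https://raw.githubusercontent.com/infiniflow/ragflow-docs/main/images/ragflow_vllm.png
```

The same pattern applies to the other two images; note the third image was also corrected from the duplicated `ragflow_vllm1.png` to `ragflow_vllm2.png`, which a URL rewrite alone would not catch.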