Editorial updates to Docker README (#3223)

### What problem does this PR solve?



### Type of change


- [x] Documentation Update
writinwaters
2024-11-06 09:43:54 +08:00
committed by GitHub
parent a418a343d1
commit af74bf01c0
6 changed files with 51 additions and 73 deletions

View File

@ -9,24 +9,7 @@ An API key is required for RAGFlow to interact with an online AI model. This gui
## Get model API key
For now, RAGFlow supports the following online LLMs. Click the corresponding link to apply for your model API key. Most LLM providers grant newly-created accounts trial credit, which will expire in a couple of months, or a promotional amount of free quota.
- [OpenAI](https://platform.openai.com/login?launch)
- [Azure-OpenAI](https://ai.azure.com/)
- [Gemini](https://aistudio.google.com/)
- [Groq](https://console.groq.com/)
- [Mistral](https://mistral.ai/)
- [Bedrock](https://aws.amazon.com/cn/bedrock/)
- [Tongyi-Qianwen](https://dashscope.console.aliyun.com/model)
- [ZHIPU-AI](https://open.bigmodel.cn/)
- [MiniMax](https://platform.minimaxi.com/)
- [Moonshot](https://platform.moonshot.cn/docs)
- [DeepSeek](https://platform.deepseek.com/api-docs/)
- [Baichuan](https://www.baichuan-ai.com/home)
- [VolcEngine](https://www.volcengine.com/docs/82379)
- [Jina](https://jina.ai/reader/)
- [OpenRouter](https://openrouter.ai/)
- [StepFun](https://platform.stepfun.com/)
RAGFlow supports most mainstream LLMs. For a complete list, see [Supported Models](./references/supported_models.mdx). You will need to apply for your model API key online. Note that most LLM providers grant newly created accounts either trial credit, which expires within a couple of months, or a promotional amount of free quota. A quick way to sanity-check a new key is sketched after the note below.
:::note
If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can [file a feature request](https://github.com/infiniflow/ragflow/issues/new?assignees=&labels=feature+request&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+) with us! Alternatively, if you have customized or locally-deployed models, you can [bind them to RAGFlow using Ollama, Xinference, or LocalAI](./deploy_local_llm.mdx).
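
If you want to confirm that a freshly issued key actually works before entering it in RAGFlow, a single request against the provider's endpoint is usually enough. The snippet below is a minimal sketch, not part of RAGFlow itself: it assumes an OpenAI-compatible `/v1/chat/completions` endpoint, an `OPENAI_API_KEY` environment variable, and the `gpt-4o-mini` model name; substitute your own provider's base URL and model.

```python
import os

import requests

# Minimal sketch: verify an LLM API key before adding it to RAGFlow.
# The base URL, model name, and OPENAI_API_KEY variable are assumptions;
# substitute the values for your own provider.
api_key = os.environ["OPENAI_API_KEY"]
base_url = "https://api.openai.com/v1"

resp = requests.post(
    f"{base_url}/chat/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 5,
    },
    timeout=30,
)
resp.raise_for_status()  # a 401 here usually means the key is invalid or not yet active
print(resp.json()["choices"][0]["message"]["content"])
```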

View File

@ -49,7 +49,7 @@ You can link your file to one knowledge base or multiple knowledge bases at one
## Search files or folders
As of RAGFlow v0.13.0, the search feature is still in a rudimentary form, supporting only file and folder search in the current directory by name (files or folders in the child directory will not be retrieved).
**File Management** only supports filtering by file name and folder name within the current directory (files or folders in subdirectories will not be retrieved).
![search file](https://github.com/infiniflow/ragflow/assets/93570324/77ffc2e5-bd80-4ed1-841f-068e664efffe)
@ -73,7 +73,7 @@ To bulk delete files or folders:
![bulk delete](https://github.com/infiniflow/ragflow/assets/93570324/519b99ab-ec7f-4c8a-8cea-e0b6dcb3cb46)
> - You are not allowed to delete the **root/.knowledgebase** folder.
> - Deleting files that have been linked to knowledge bases will AUTOMATICALLY REMOVE all associated file references across the knowledge bases.
> - Deleting files that have been linked to knowledge bases will **AUTOMATICALLY REMOVE** all associated file references across the knowledge bases.
## Download uploaded file

View File

@ -223,24 +223,7 @@ With the default settings, you only need to enter `http://IP_OF_YOUR_MACHINE` (*
## Configure LLMs
RAGFlow is a RAG engine, and it needs to work with an LLM to offer grounded, hallucination-free question-answering capabilities. For now, RAGFlow supports the following LLMs, and the list is expanding:
- [OpenAI](https://platform.openai.com/login?launch)
- [Azure-OpenAI](https://ai.azure.com/)
- [Gemini](https://aistudio.google.com/)
- [Groq](https://console.groq.com/)
- [Mistral](https://mistral.ai/)
- [Bedrock](https://aws.amazon.com/cn/bedrock/)
- [Tongyi-Qianwen](https://dashscope.console.aliyun.com/model)
- [ZHIPU-AI](https://open.bigmodel.cn/)
- [MiniMax](https://platform.minimaxi.com/)
- [Moonshot](https://platform.moonshot.cn/docs)
- [DeepSeek](https://platform.deepseek.com/api-docs/)
- [Baichuan](https://www.baichuan-ai.com/home)
- [VolcEngine](https://www.volcengine.com/docs/82379)
- [Jina](https://jina.ai/reader/)
- [OpenRouter](https://openrouter.ai/)
- [StepFun](https://platform.stepfun.com/)
RAGFlow is a RAG engine and needs to work with an LLM to offer grounded, hallucination-free question-answering capabilities. RAGFlow supports most mainstream LLMs. For a complete list of supported models, please refer to [Supported Models](./references/supported_models.mdx).
:::note
RAGFlow also supports deploying LLMs locally using Ollama, Xinference, or LocalAI, but this part is not covered in this quick start guide.
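
For readers who do go the local route later, a quick way to confirm that a locally served model responds before binding it to RAGFlow is to call the serving framework directly. The sketch below assumes Ollama on its default port (11434) with a model such as `llama3` already pulled; it is a connectivity check only, not part of the RAGFlow setup.

```python
import requests

# Minimal sketch: confirm a local Ollama server answers before binding it to RAGFlow.
# The default port 11434 and the model name "llama3" are assumptions; adjust to your setup.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Reply with one word."}],
        "stream": False,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```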

View File

@ -5,6 +5,8 @@ slug: /faq
# Frequently asked questions
Queries regarding general usage, troubleshooting, features, performance, and more.
## General
### 1. What sets RAGFlow apart from other RAG products?
@ -98,7 +100,7 @@ docker build -t infiniflow/ragflow:vX.Y.Z. --network host
#### 2.1 Cannot access https://huggingface.co
A *locally* deployed RAGFlow downloads OCR and embedding modules from the [Hugging Face website](https://huggingface.co) by default. If your machine cannot access this site, the following error occurs and PDF parsing fails:
```
FileNotFoundError: [Errno 2] No such file or directory: '/root/.cache/huggingface/hub/models--InfiniFlow--deepdoc/snapshots/be0c1e50eef6047b412d1800aa89aba4d275f997/ocr.res'
```
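
One way to narrow this down is to check from the host whether Hugging Face is reachable at all, and whether an alternative endpoint would be. The sketch below is a simple connectivity probe; the `HF_ENDPOINT` environment variable it reads is the standard `huggingface_hub` override for pointing downloads at a mirror, and whether your deployment passes it through to the container is an assumption you should verify.

```python
import os

import requests

# Minimal sketch: probe Hugging Face reachability from this machine.
# HF_ENDPOINT is the standard huggingface_hub override for download URLs;
# whether your RAGFlow deployment forwards it into the container is an assumption.
endpoint = os.environ.get("HF_ENDPOINT", "https://huggingface.co")

try:
    resp = requests.head(endpoint, timeout=10, allow_redirects=True)
    print(f"{endpoint} is reachable (HTTP {resp.status_code})")
except requests.RequestException as exc:
    print(f"Cannot reach {endpoint}: {exc}")
    print("Consider setting HF_ENDPOINT to a reachable mirror before starting RAGFlow.")
```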

View File

@ -12,7 +12,7 @@ A complete list of models supported by RAGFlow, which will continue to expand.
<APITable>
```
| Provider | Chat | Embedding | Rerank | Multi-modal | ASR/STT | TTS |
| Provider | Chat | Embedding | Rerank | Multimodal | ASR | TTS |
| --------------------- | ------------------ | ------------------ | ------------------ | ------------------ | ------------------ | ------------------ |
| Anthropic | :heavy_check_mark: | | | | | |
| Azure-OpenAI | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: | :heavy_check_mark: | |
@ -20,7 +20,7 @@ A complete list of models supported by RAGFlow, which will continue to expand.
| BaiChuan | :heavy_check_mark: | :heavy_check_mark: | | | | |
| BaiduYiyan | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | |
| Bedrock | :heavy_check_mark: | :heavy_check_mark: | | | | |
| cohere | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | |
| Cohere | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | |
| DeepSeek | :heavy_check_mark: | | | | | |
| FastEmbed | | :heavy_check_mark: | | | | |
| Fish Audio | | | | | | :heavy_check_mark: |
@ -62,5 +62,6 @@ A complete list of models supported by RAGFlow, which will continue to expand.
</APITable>
```
:::note
The list of supported models is extracted from [this source](https://github.com/infiniflow/ragflow/blob/main/rag/llm/__init__.py) and may not be the most current. For the latest supported model list, please refer to the Python file.
:::
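
If you want the latest list programmatically rather than by reading the file, one heuristic is to fetch the raw source and collect the quoted provider keys it registers. The raw URL and the key-matching regular expression below are assumptions about the file's current layout, not a stable interface, so treat the output as a convenience only.

```python
import re

import requests

# Heuristic sketch: list provider names registered in rag/llm/__init__.py.
# The raw URL and the regex over quoted dictionary keys are assumptions about
# the file's layout and may break if the file is reorganised.
url = "https://raw.githubusercontent.com/infiniflow/ragflow/main/rag/llm/__init__.py"
source = requests.get(url, timeout=30).text

providers = sorted(set(re.findall(r'^\s*"([^"]+)":', source, flags=re.MULTILINE)))
print("\n".join(providers))
```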