Mirror of https://github.com/infiniflow/ragflow.git, synced 2025-12-08 20:42:30 +08:00
Fix typo and error (#5479)
### What problem does this PR solve?

_Briefly describe what this PR aims to solve. Include background context that will help reviewers understand the purpose of the PR._

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Documentation Update

---------

Signed-off-by: Jin Hai <haijin.chn@gmail.com>
```diff
@@ -5,7 +5,7 @@ slug: /text2sql_agent

 # Create a Text2SQL agent

-Build a Text2SQL agent leverging RAGFlow's RAG capabilities. Contributed by @TeslaZY.
+Build a Text2SQL agent leveraging RAGFlow's RAG capabilities. Contributed by @TeslaZY.

 ## Scenario
```
````diff
@@ -343,7 +343,7 @@ Synonyms: laptop computer,laptop pc

 3. Create a Retrieval node and name it Thesaurus; create an ExeSQL node.
 4. Configure the Q->SQL, DDL, DB_Description, and TextSQL_Thesaurus knowledge bases. Please refer to the following:

 

-5. Configure the Generate node, named LLM‘s prompt:
+5. Configure the Generate node, named LLM's prompt:

 - Add this content to the prompt provided by the template to provide the thesaurus content to the LLM:

 ```plaintext
 ## You may use the following Thesaurus statements. For example, what I ask is from Synonyms, you must use Standard noun to generate SQL. Use responses to past questions also to guide you: {sql_thesaurus}.
````
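The thesaurus in the hunk above maps synonyms to a standard noun so the LLM generates SQL against canonical column values. As a rough illustration of that idea only, here is a minimal sketch of synonym normalization in plain Python; the `THESAURUS` dict and `normalize_question` helper are made up for this example and are not RAGFlow code (RAGFlow injects the thesaurus into the prompt via `{sql_thesaurus}` and lets the LLM do the mapping):

```python
# Hypothetical sketch: rewrite user phrasing to standard nouns using a
# thesaurus like the one shown above ("Synonyms: laptop computer,laptop pc").
# The mapping and helper are illustrative, not part of RAGFlow.

THESAURUS = {
    # synonym -> standard noun
    "laptop computer": "laptop",
    "laptop pc": "laptop",
}

def normalize_question(question: str, thesaurus: dict[str, str]) -> str:
    """Replace known synonyms with their standard nouns, longest match first
    so that overlapping synonyms do not clobber each other."""
    for synonym in sorted(thesaurus, key=len, reverse=True):
        question = question.replace(synonym, thesaurus[synonym])
    return question

print(normalize_question("how many laptop pc units sold?", THESAURUS))
# how many laptop units sold?
```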
```diff
@@ -25,7 +25,7 @@ Knowledge graphs are especially useful for multi-hop question-answering involving

 ## Prerequisites

-The system's default chat model is used to generate knowledge graph. Before proceeding, ensure that you have an chat model properly configured:
+The system's default chat model is used to generate knowledge graph. Before proceeding, ensure that you have a chat model properly configured:

 
```
```diff
@@ -60,10 +60,10 @@ In a knowledge graph, a community is a cluster of entities linked by relationships

 1. On the **Configuration** page of your knowledge base, switch on **Extract knowledge graph** or adjust its settings as needed, and click **Save** to confirm your changes.

-   - *The default knowledge graph configurations for your knowlege base are now set and files uploaded from this point onward will automatically use these settings during parsing.*
+   - *The default knowledge graph configurations for your knowledge base are now set and files uploaded from this point onward will automatically use these settings during parsing.*
    - *Files parsed before this update will retain their original knowledge graph settings.*

-2. The knowledge graph of your knowlege base does *not* automatically update *until* a newly uploaded file is parsed.
+2. The knowledge graph of your knowledge base does *not* automatically update *until* a newly uploaded file is parsed.

 _A **Knowledge graph** entry appears under **Configuration** once a knowledge graph is created._
```
````diff
@@ -59,20 +59,20 @@ success

 ### 2. Ensure Ollama is accessible

-If RAGFlow runs in Docker and Ollama runs on the same host machine, check if ollama is accessiable from inside the RAGFlow container:
+If RAGFlow runs in Docker and Ollama runs on the same host machine, check if ollama is accessible from inside the RAGFlow container:

 ```bash
 sudo docker exec -it ragflow-server bash
 root@8136b8c3e914:/ragflow# curl http://host.docker.internal:11434/
 Ollama is running
 ```

-If RAGFlow runs from source code and Ollama runs on the same host machine, check if ollama is accessiable from RAGFlow host machine:
+If RAGFlow runs from source code and Ollama runs on the same host machine, check if ollama is accessible from RAGFlow host machine:

 ```bash
 curl http://localhost:11434/
 Ollama is running
 ```

-If RAGFlow and Ollama run on different machines, check if ollama is accessiable from RAGFlow host machine:
+If RAGFlow and Ollama run on different machines, check if ollama is accessible from RAGFlow host machine:

 ```bash
 curl http://${IP_OF_OLLAMA_MACHINE}:11434/
 Ollama is running
````
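The three `curl` checks in the hunk above follow one rule: which base URL to probe depends on where RAGFlow runs relative to Ollama. A minimal sketch of that decision, assuming the default Ollama port 11434 (the `ollama_probe_url` helper is made up for illustration and is not part of RAGFlow):

```python
# Hypothetical helper mirroring the three documented cases. Not RAGFlow code.

def ollama_probe_url(ragflow_in_docker: bool, same_host: bool,
                     ollama_ip: str = "localhost", port: int = 11434) -> str:
    if same_host and ragflow_in_docker:
        # From inside the RAGFlow container, the host machine is reachable
        # via the special name host.docker.internal.
        host = "host.docker.internal"
    elif same_host:
        host = "localhost"
    else:
        host = ollama_ip  # e.g. the value of ${IP_OF_OLLAMA_MACHINE}
    return f"http://{host}:{port}/"

print(ollama_probe_url(True, True))
# http://host.docker.internal:11434/
```

In every case, a healthy endpoint answers the bare GET with `Ollama is running`.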
```diff
@@ -22,7 +22,7 @@ The "garbage in garbage out" status quo remains unchanged despite the fact that

 ---

-### Where to find the version of RAGFlow? How to interprete it?
+### Where to find the version of RAGFlow? How to interpret it?

 You can find the RAGFlow version number on the **System** page of the UI:
```
```diff
@@ -345,13 +345,13 @@ Your IP address or port number may be incorrect. If you are using the default co
 A correct Ollama IP address and port is crucial to adding models to Ollama:

 - If you are on demo.ragflow.io, ensure that the server hosting Ollama has a publicly accessible IP address. Note that 127.0.0.1 is not a publicly accessible IP address.
-- If you deploy RAGFlow locally, ensure that Ollama and RAGFlow are in the same LAN and can comunicate with each other.
+- If you deploy RAGFlow locally, ensure that Ollama and RAGFlow are in the same LAN and can communicate with each other.

 See [Deploy a local LLM](../guides/deploy_local_llm.mdx) for more information.

 ---

-#### Do you offer examples of using deepdoc to parse PDF or other files?
+#### Do you offer examples of using DeepDoc to parse PDF or other files?

 Yes, we do. See the Python files under the **rag/app** folder.
```
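The 127.0.0.1 caveat in the hunk above can be checked mechanically with Python's standard `ipaddress` module. A hedged sketch only; the `is_publicly_accessible` name is invented for this example, and real reachability also depends on firewalls and NAT:

```python
import ipaddress

def is_publicly_accessible(ip: str) -> bool:
    """Rough screen for the demo.ragflow.io case above: loopback, private
    (RFC 1918), and link-local addresses cannot be reached from a remote
    server, so Ollama bound to them is invisible to it."""
    addr = ipaddress.ip_address(ip)
    return not (addr.is_loopback or addr.is_private or addr.is_link_local)

print(is_publicly_accessible("127.0.0.1"))     # False: loopback
print(is_publicly_accessible("192.168.1.10"))  # False: LAN-only
```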
```diff
@@ -42,7 +42,7 @@ Whether to receive the response as a stream. Set this to `false` explicitly if y

 #### Returns

-- Success: Respose [message](https://platform.openai.com/docs/api-reference/chat/create) like OpenAI
+- Success: Response [message](https://platform.openai.com/docs/api-reference/chat/create) like OpenAI
 - Failure: `Exception`

 #### Examples
```
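The hunk above documents an OpenAI-compatible return shape. As a sketch of what consuming such a payload looks like, here is a hand-written sample dict in that shape and a tiny accessor; no API call is made, and both the sample values and the `first_message_content` helper are illustrative, not RAGFlow or OpenAI library code:

```python
# Illustrative only: a dict shaped like an OpenAI chat-completion response
# (see the linked API reference). The payload below is hand-written.
sample_response = {
    "choices": [
        {"index": 0, "message": {"role": "assistant", "content": "Hello!"}}
    ],
    "model": "model-name",
}

def first_message_content(response: dict) -> str:
    """Pull the assistant text out of an OpenAI-style completion payload."""
    return response["choices"][0]["message"]["content"]

print(first_message_content(sample_response))  # Hello!
```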