Don't release full image (#10654)
### What problem does this PR solve?

- Introduced a GPU profile in `.env`
- Added `Dockerfile_tei`
- Fixed `datrie`
- Removed the `LIGHTEN` flag

### Type of change

- [x] Documentation Update
- [x] Refactoring
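The GPU profile is presumably selected through Docker Compose profiles. Below is a minimal sketch of how such a profile is typically enabled; the `COMPOSE_PROFILES` entry in `docker/.env`, the directory layout, and the `gpu` profile name are assumptions inferred from this description, not details confirmed by the diff.

```bash
# Hedged sketch: enable a GPU profile via Docker Compose profiles.
# The docker/.env location and the "gpu" profile name are assumptions.
cd docker
echo "COMPOSE_PROFILES=gpu" >> .env   # persist the profile selection in .env
docker compose up -d                  # starts only services tagged with the gpu profile

# Equivalent one-off form that does not touch .env:
docker compose --profile gpu up -d
```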
@@ -62,7 +62,7 @@ $ sudo docker exec ollama ollama pull bge-m3
- If RAGFlow runs in Docker, the host machine's localhost is reachable from inside the RAGFlow Docker container as `host.docker.internal`. If Ollama runs on the same host machine, the correct base URL for Ollama is `http://host.docker.internal:11434/`, and you should verify that Ollama is accessible from inside the RAGFlow container with:
```bash
-$ sudo docker exec -it ragflow-server bash
+$ sudo docker exec -it docker-ragflow-cpu-1 bash
$ curl http://host.docker.internal:11434/
> Ollama is running
```
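One caveat, not part of this change: on Linux hosts, `host.docker.internal` is not defined inside containers by default. A common workaround is to map it to the Docker host gateway with `extra_hosts`. The sketch below assumes a Compose override file and a service named `ragflow-cpu`; both are illustrative assumptions, not taken from this PR.

```bash
# Hedged sketch for Linux hosts where host.docker.internal does not resolve.
# The service name "ragflow-cpu" and the override-file approach are assumptions.
cd docker
cat > docker-compose.override.yml <<'EOF'
services:
  ragflow-cpu:
    extra_hosts:
      - "host.docker.internal:host-gateway"
EOF
docker compose up -d   # Compose merges docker-compose.override.yml automatically;
                       # add a --profile flag if the stack uses Compose profiles

# Re-run the same check from inside the container:
sudo docker exec -it docker-ragflow-cpu-1 bash
# then, inside the container:
curl http://host.docker.internal:11434/
# Expected output: Ollama is running
```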