Editorial updates to Docker README (#3223)

### What problem does this PR solve?

### Type of change

- [x] Documentation Update
Parent: a418a343d1 · Commit: af74bf01c0

@@ -8,66 +8,75 @@ Look into [.env](./.env), there are some important variables.

- `STACK_VERSION`

  The Elasticsearch version. Defaults to `8.11.3`.

- `ES_PORT`

  The port used to expose the Elasticsearch HTTP API to the host. Defaults to `1200`.

- `ELASTIC_PASSWORD`

  The Elasticsearch password.

- `MYSQL_PASSWORD`

  The MySQL password. When updated, you must also revise the `mysql.password` entry in [service_conf.yaml](./service_conf.yaml) accordingly.

- `MYSQL_PORT`

  The port exposed by the MySQL Docker container, needed when you access the database from outside the Docker container.
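For context, here is a sketch of how such a port variable is typically wired up in [docker-compose.yml](./docker-compose.yml); the service name and exact mapping below are illustrative assumptions, not the shipped file:

```
services:
  mysql:
    ports:
      # Host-side port comes from MYSQL_PORT in .env; 3306 is MySQL's
      # standard port inside the container.
      - ${MYSQL_PORT}:3306
```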
- `MINIO_USER`

  The MinIO username. When updated, you must also revise the `minio.user` entry in [service_conf.yaml](./service_conf.yaml) accordingly.

- `MINIO_PASSWORD`

  The MinIO password. When updated, you must also revise the `minio.password` entry in [service_conf.yaml](./service_conf.yaml) accordingly.
- `SVR_HTTP_PORT`

  The port number on which RAGFlow's backend API server listens.

- `RAGFLOW-IMAGE`

  The Docker image edition. Available options:

  - `infiniflow/ragflow:dev-slim` (default): The RAGFlow Docker image without embedding models.
  - `infiniflow/ragflow:dev`: The RAGFlow Docker image with embedding models including:
    - Embedded embedding models:
      - `BAAI/bge-large-zh-v1.5`
      - `BAAI/bge-reranker-v2-m3`
      - `maidalun1020/bce-embedding-base_v1`
      - `maidalun1020/bce-reranker-base_v1`
    - Embedding models that will be downloaded once you select them in the RAGFlow UI:
      - `BAAI/bge-base-en-v1.5`
      - `BAAI/bge-large-en-v1.5`
      - `BAAI/bge-small-en-v1.5`
      - `BAAI/bge-small-zh-v1.5`
      - `jinaai/jina-embeddings-v2-base-en`
      - `jinaai/jina-embeddings-v2-small-en`
      - `nomic-ai/nomic-embed-text-v1.5`
      - `sentence-transformers/all-MiniLM-L6-v2`

- `TIMEZONE`

  The local time zone.
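Purely as an illustration, a [.env](./.env) fragment using these variables might look like the following. Every value shown is a placeholder, not a recommended setting, and `RAGFLOW_IMAGE` is written with an underscore here because shell environment variable names cannot contain hyphens (an assumption about the actual variable name in the shipped file):

```
STACK_VERSION=8.11.3
ES_PORT=1200
ELASTIC_PASSWORD=changeme_elastic
MYSQL_PASSWORD=changeme_mysql
MYSQL_PORT=5455
MINIO_USER=rag_flow
MINIO_PASSWORD=changeme_minio
SVR_HTTP_PORT=9380
TIMEZONE='Asia/Shanghai'
RAGFLOW_IMAGE=infiniflow/ragflow:dev-slim
```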
## Service Configuration

[service_conf.yaml](./service_conf.yaml) defines the system-level configuration for RAGFlow and is used by its API server and task executor.

- `ragflow`

  - `host`: The IP address of the API server.
  - `port`: The serving port of the API server.
- `mysql`

  - `name`: The database name in MySQL used by RAGFlow. Defaults to `rag_flow`.
  - `user`: The MySQL user name.
  - `password`: The MySQL password. When updated, you must also revise the `MYSQL_PASSWORD` variable in [.env](./.env) accordingly.
  - `port`: The serving port of MySQL inside the Docker container. When updated, you must also revise the `MYSQL_PORT` variable in [.env](./.env) accordingly.
  - `max_connections`: The maximum number of database connections.
  - `stale_timeout`: Timeout in seconds.
- `minio`

  - `user`: The MinIO username. When updated, you must also revise the `MINIO_USER` variable in [.env](./.env) accordingly.
  - `password`: The MinIO password. When updated, you must also revise the `MINIO_PASSWORD` variable in [.env](./.env) accordingly.
  - `host`: The serving IP and port inside the Docker container. This will not be updated until you change the `minio` section in [docker-compose.yml](./docker-compose.yml).
- `user_default_llm`

  The default LLM to use for a new RAGFlow user. It is disabled by default. If you have not set it here, you can configure the default LLM on the **Settings** page in the RAGFlow UI.

  - `factory`: The LLM supplier. "OpenAI", "Tongyi-Qianwen", "ZHIPU-AI", "Moonshot", "DeepSeek", "Baichuan", and "VolcEngine" are supported.
  - `api_key`: The API key for the specified LLM.
- `oauth`

  The OAuth configuration for signing up or signing in to RAGFlow using a third-party account. It is disabled by default. To enable this feature, uncomment the corresponding lines in **service_conf.yaml**.

  - `github`: The GitHub authentication settings for your application. Visit the [GitHub Developer Settings page](https://github.com/settings/developers) to obtain your *client_id* and *secret_key*.
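Putting the sections above together, a [service_conf.yaml](./service_conf.yaml) might be sketched as follows. The values are illustrative placeholders only; the authoritative keys and defaults are those in the shipped file:

```
ragflow:
  host: 0.0.0.0
  port: 9380          # placeholder; must agree with SVR_HTTP_PORT wiring
mysql:
  name: 'rag_flow'
  user: 'root'
  password: 'changeme_mysql'   # keep in sync with MYSQL_PASSWORD in .env
  port: 3306                   # keep in sync with MYSQL_PORT in .env
  max_connections: 100
  stale_timeout: 30
minio:
  user: 'rag_flow'             # keep in sync with MINIO_USER in .env
  password: 'changeme_minio'   # keep in sync with MINIO_PASSWORD in .env
  host: 'minio:9000'
user_default_llm:
  factory: 'Tongyi-Qianwen'
  api_key: 'your_api_key_here'
oauth:
  github:
    client_id: 'your_client_id'
    secret_key: 'your_secret_key'
```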
@@ -9,24 +9,7 @@ An API key is required for RAGFlow to interact with an online AI model. This gui

## Get model API key

RAGFlow supports most mainstream LLMs. Please refer to [Supported Models](./references/supported_models.mdx) for a complete list of supported models. You will need to apply for your model API key online. Note that most LLM providers grant newly-created accounts trial credit, which will expire in a couple of months, or a promotional amount of free quota.

:::note
If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can [file a feature request](https://github.com/infiniflow/ragflow/issues/new?assignees=&labels=feature+request&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+) with us! Alternatively, if you have customized or locally-deployed models, you can [bind them to RAGFlow using Ollama, Xinference, or LocalAI](./deploy_local_llm.mdx).
:::
@@ -49,7 +49,7 @@ You can link your file to one knowledge base or multiple knowledge bases at one

## Search files or folders

**File Management** only supports file name and folder name filtering in the current directory (files or folders in the child directory will not be retrieved).

![search file](https://github.com/user-attachments/assets/77ffc2e5-bd80-4ed1-841f-068e664efffe)
@@ -73,7 +73,7 @@ To bulk delete files or folders:

![bulk delete](https://github.com/user-attachments/assets/519b99ab-ec7f-4c8a-8cea-e0b6dcb3cb46)

> - You are not allowed to delete the **root/.knowledgebase** folder.
> - Deleting files that have been linked to knowledge bases will **AUTOMATICALLY REMOVE** all associated file references across the knowledge bases.

## Download uploaded file
@@ -223,24 +223,7 @@ With the default settings, you only need to enter `http://IP_OF_YOUR_MACHINE` (*

## Configure LLMs

RAGFlow is a RAG engine and needs to work with an LLM to offer grounded, hallucination-free question-answering capabilities. RAGFlow supports most mainstream LLMs. For a complete list of supported models, please refer to [Supported Models](./references/supported_models.mdx).

:::note
RAGFlow also supports deploying LLMs locally using Ollama, Xinference, or LocalAI, but this part is not covered in this quick start guide.
:::
@@ -5,6 +5,8 @@ slug: /faq

# Frequently asked questions

Queries regarding general usage, troubleshooting, features, performance, and more.

## General

### 1. What sets RAGFlow apart from other RAG products?
@@ -98,7 +100,7 @@ docker build -t infiniflow/ragflow:vX.Y.Z. --network host

#### 2.1 Cannot access https://huggingface.co

A *locally* deployed RAGFlow downloads OCR and embedding modules from the [Hugging Face website](https://huggingface.co) by default. If your machine is unable to access this site, the following error occurs and PDF parsing fails:

```
FileNotFoundError: [Errno 2] No such file or directory: '/root/.cache/huggingface/hub/models--InfiniFlow--deepdoc/snapshots/be0c1e50eef6047b412d1800aa89aba4d275f997/ocr.res'
```
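A common workaround in this situation, shown here as a sketch, is to point the Hugging Face client at a mirror via the `HF_ENDPOINT` environment variable before starting the containers. Whether `hf-mirror.com` (a community mirror) is acceptable depends on your environment:

```shell
# Assumption: hf-mirror.com is a reachable community mirror of huggingface.co.
# Set the endpoint before launching RAGFlow so model downloads use the mirror.
export HF_ENDPOINT=https://hf-mirror.com
echo "Hugging Face endpoint: $HF_ENDPOINT"
```

After setting the variable, restart the RAGFlow containers (e.g. with `docker compose up -d`) so the new endpoint takes effect; whether your compose file forwards `HF_ENDPOINT` into the container is deployment-specific.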
@@ -12,7 +12,7 @@ A complete list of models supported by RAGFlow, which will continue to expand.

<APITable>
```

| Provider | Chat | Embedding | Rerank | Multimodal | ASR | TTS |
| --------------------- | ------------------ | ------------------ | ------------------ | ------------------ | ------------------ | ------------------ |
| Anthropic | :heavy_check_mark: | | | | | |
| Azure-OpenAI | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: | :heavy_check_mark: | |
@@ -20,7 +20,7 @@ A complete list of models supported by RAGFlow, which will continue to expand.

| BaiChuan | :heavy_check_mark: | :heavy_check_mark: | | | | |
| BaiduYiyan | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | |
| Bedrock | :heavy_check_mark: | :heavy_check_mark: | | | | |
| Cohere | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | |
| DeepSeek | :heavy_check_mark: | | | | | |
| FastEmbed | | :heavy_check_mark: | | | | |
| Fish Audio | | | | | | :heavy_check_mark: |
@@ -62,5 +62,6 @@ A complete list of models supported by RAGFlow, which will continue to expand.

</APITable>
```

:::note
The list of supported models is extracted from [this source](https://github.com/infiniflow/ragflow/blob/main/rag/llm/__init__.py) and may not be the most current. For the latest supported model list, please refer to the Python file.
:::