Update documentation in README files

yangdx 2025-08-17 02:23:14 +08:00
parent 1ed77a2e53
commit da7e4b79e5
3 changed files with 7 additions and 3 deletions

env.example

@@ -123,7 +123,7 @@ MAX_PARALLEL_INSERT=2
###########################################################
### LLM Configuration
-### LLM_BINDING type: openai, ollama, lollms, azure_openai
+### LLM_BINDING type: openai, ollama, lollms, azure_openai, aws_bedrock
###########################################################
### LLM temperature setting for all LLM bindings (openai, azure_openai, ollama)
# TEMPERATURE=1.0
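
For readers enabling the new binding, a minimal sketch of an `.env` fragment follows. Only `LLM_BINDING` appears in the file above; `LLM_MODEL`, the Bedrock model ID, and the AWS credential variables are illustrative assumptions (the credentials follow the standard AWS SDK convention):

```bash
# Hypothetical .env fragment for the aws_bedrock binding (a sketch, not from this commit).
LLM_BINDING=aws_bedrock
# Example Bedrock model ID; use any model enabled in your AWS account.
LLM_MODEL=anthropic.claude-3-haiku-20240307-v1:0
# Standard AWS SDK credential variables, read by the underlying AWS client.
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=us-east-1
```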

lightrag/api/README-zh.md

@@ -40,6 +40,7 @@ LightRAG requires the integration of both an LLM (Large Language Model) and an embedding model to effectively
* lollms
* openai or openai compatible
* azure_openai
+* aws_bedrock
It is recommended to use environment variables to configure the LightRAG Server. A sample environment variable file named `env.example` is provided in the project root directory. Please copy this file to the startup directory and rename it to `.env`. After that, you can modify the parameters related to the LLM and embedding models in the `.env` file. Note that the LightRAG Server loads the environment variables from `.env` into the system environment variables each time it starts. **The LightRAG Server will preferentially use the settings in the system environment variables.**
@@ -357,6 +358,7 @@ LightRAG supports binding to various LLM/Embedding backends:
* openai & openai compatible
* azure_openai
* lollms
+* aws_bedrock
Use the environment variable `LLM_BINDING` or the CLI argument `--llm-binding` to select the LLM backend type. Use the environment variable `EMBEDDING_BINDING` or the CLI argument `--embedding-binding` to select the embedding backend type.
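
To make the backend selection concrete, here is a minimal sketch of the steps described above; the two binding values are examples only:

```bash
# Copy the sample file from the project root, then select backend types.
cp env.example .env
# Append or edit the binding selection in .env (example values):
echo "LLM_BINDING=aws_bedrock" >> .env
echo "EMBEDDING_BINDING=aws_bedrock" >> .env
```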

lightrag/api/README.md

@@ -40,6 +40,7 @@ LightRAG necessitates the integration of both an LLM (Large Language Model) and
* lollms
* openai or openai compatible
* azure_openai
+* aws_bedrock
It is recommended to use environment variables to configure the LightRAG Server. There is an example environment variable file named `env.example` in the root directory of the project. Please copy this file to the startup directory and rename it to `.env`. After that, you can modify the parameters related to the LLM and Embedding models in the `.env` file. It is important to note that the LightRAG Server will load the environment variables from `.env` into the system environment variables each time it starts. **LightRAG Server will prioritize the settings in the system environment variables over those in the `.env` file.**
@@ -360,6 +361,7 @@ LightRAG supports binding to various LLM/Embedding backends:
* openai & openai compatible
* azure_openai
* lollms
+* aws_bedrock
Use the environment variable `LLM_BINDING` or the CLI argument `--llm-binding` to select the LLM backend type. Use the environment variable `EMBEDDING_BINDING` or the CLI argument `--embedding-binding` to select the Embedding backend type.
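
The CLI flags can be used in place of the environment variables; a minimal sketch, assuming the server is launched through a `lightrag-server` entry point (the command name is an assumption, not stated in this excerpt):

```bash
# Hypothetical launch: select AWS Bedrock for both backends via CLI flags.
lightrag-server --llm-binding aws_bedrock --embedding-binding aws_bedrock
```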
@@ -459,8 +461,8 @@ You cannot change storage implementation selection after adding documents to LightRAG
| --ssl-keyfile | None | Path to SSL private key file (required if --ssl is enabled) |
| --top-k | 50 | Number of top-k items to retrieve; corresponds to entities in "local" mode and relationships in "global" mode. |
| --cosine-threshold | 0.4 | The cosine threshold for node and relation retrieval; works with top-k to control the retrieval of nodes and relations. |
-| --llm-binding | ollama | LLM binding type (lollms, ollama, openai, openai-ollama, azure_openai) |
-| --embedding-binding | ollama | Embedding binding type (lollms, ollama, openai, azure_openai) |
+| --llm-binding | ollama | LLM binding type (lollms, ollama, openai, openai-ollama, azure_openai, aws_bedrock) |
+| --embedding-binding | ollama | Embedding binding type (lollms, ollama, openai, azure_openai, aws_bedrock) |
| --auto-scan-at-startup | - | Scan input directory for new files and start indexing |
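
Because --top-k and --cosine-threshold work together, a short sketch of tuning them may help; the `lightrag-server` command name and the values shown are illustrative, not from this commit:

```bash
# Hypothetical retrieval tuning: --top-k caps how many entities ("local" mode)
# or relationships ("global" mode) are retrieved, while --cosine-threshold
# (default 0.4) filters low-similarity candidates before the top-k cut.
lightrag-server --top-k 30 --cosine-threshold 0.5
```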
### Additional Ollama Binding Options