## API Server Implementation

LightRAG also provides a FastAPI-based server implementation for RESTful API access to RAG operations. This allows you to run LightRAG as a service and interact with it through HTTP requests.

### Setting up the API Server

1. Install the required dependencies:

   ```bash
   pip install fastapi uvicorn pydantic
   ```

2. Set up your environment variables:

   ```bash
   export RAG_DIR="your_index_directory"              # Optional: defaults to "index_default"
   export OPENAI_BASE_URL="your OpenAI API base URL"  # Optional: defaults to "https://api.openai.com/v1"
   export OPENAI_API_KEY="your OpenAI API key"        # Required
   export LLM_MODEL="your LLM model"                  # Optional: defaults to "gpt-4o-mini"
   export EMBEDDING_MODEL="your embedding model"      # Optional: defaults to "text-embedding-3-large"
   ```

3. Run the API server:

   ```bash
   python examples/lightrag_api_openai_compatible_demo.py
   ```

The server will start on `http://0.0.0.0:8020`.
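
For reference, here is a minimal sketch of how these settings can be read in Python with the documented defaults (variable names are taken from the list above; the actual demo script may organize this differently):

```python
import os

# Defaults mirror the values documented above.
RAG_DIR = os.environ.get("RAG_DIR", "index_default")
OPENAI_BASE_URL = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]  # required; raises KeyError if unset
LLM_MODEL = os.environ.get("LLM_MODEL", "gpt-4o-mini")
EMBEDDING_MODEL = os.environ.get("EMBEDDING_MODEL", "text-embedding-3-large")
```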

### API Endpoints

The API server provides the following endpoints:

#### 1. Query Endpoint

- **URL:** `/query`
- **Method:** POST
- **Body:**

  ```json
  {
      "query": "Your question here",
      "mode": "hybrid",
      "only_need_context": true
  }
  ```

  `mode` can be `"naive"`, `"local"`, `"global"`, or `"hybrid"`. `only_need_context` is optional and defaults to `false`; when `true`, only the retrieved context is returned instead of the LLM-generated answer.

- **Example:**

  ```bash
  curl -X POST "http://127.0.0.1:8020/query" \
       -H "Content-Type: application/json" \
       -d '{"query": "What are the main themes?", "mode": "hybrid"}'
  ```

#### 2. Insert Text Endpoint

- **URL:** `/insert`
- **Method:** POST
- **Body:**

  ```json
  {
      "text": "Your text content here"
  }
  ```

- **Example:**

  ```bash
  curl -X POST "http://127.0.0.1:8020/insert" \
       -H "Content-Type: application/json" \
       -d '{"text": "Content to be inserted into RAG"}'
  ```

#### 3. Insert File Endpoint

- **URL:** `/insert_file`
- **Method:** POST
- **Body:**

  ```json
  {
      "file_path": "path/to/your/file.txt"
  }
  ```

- **Example:**

  ```bash
  curl -X POST "http://127.0.0.1:8020/insert_file" \
       -H "Content-Type: application/json" \
       -d '{"file_path": "./book.txt"}'
  ```

#### 4. Health Check Endpoint

- **URL:** `/health`
- **Method:** GET
- **Example:**

  ```bash
  curl -X GET "http://127.0.0.1:8020/health"
  ```
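
A small Python helper (illustrative, not part of the repo) that polls this endpoint until the server is ready can be handy in scripts:

```python
import time
import requests

def wait_for_server(url: str = "http://127.0.0.1:8020/health", retries: int = 30) -> bool:
    """Poll the health endpoint until the API server responds or retries run out."""
    for _ in range(retries):
        try:
            if requests.get(url, timeout=2).ok:
                return True
        except requests.exceptions.ConnectionError:
            pass  # server not up yet
        time.sleep(1)
    return False

print("server ready" if wait_for_server() else "server did not come up")
```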

### Configuration

The API server can be configured using environment variables:

- `RAG_DIR`: directory for storing the RAG index (default: `index_default`)
- `OPENAI_API_KEY`: API key for your OpenAI-compatible provider (required)
- `OPENAI_BASE_URL`: base URL of the OpenAI-compatible API (default: `https://api.openai.com/v1`)
- `LLM_MODEL`: model used for generation (default: `gpt-4o-mini`)
- `EMBEDDING_MODEL`: model used for embeddings (default: `text-embedding-3-large`)

To use a different LLM or embedding provider, adjust the model functions in the demo script accordingly.