41 Commits

Author SHA1 Message Date
zrguo
e17e61f58e fix lint 2025-04-03 14:44:56 +08:00
zrguo
9648300b18 Merge pull request #1208 from shane-lil/openai-client-config
feat(openai): add client configuration support to OpenAI integration
2025-04-03 17:43:57 +11:00
yangdx
80335d57a5 Fix linting 2025-03-28 21:43:47 +08:00
yangdx
491c78dac1 Improve OpenAI LLM logging with more detailed debug information 2025-03-28 21:33:59 +08:00
Shane Walker
d45dc14069 feat(openai): add client configuration support to OpenAI integration
Add support for custom client configurations in the OpenAI integration,
allowing for more flexible configuration of the AsyncOpenAI client.
This includes:

- Create a reusable helper function `create_openai_async_client`
- Add proper documentation for client configuration options
- Ensure consistent parameter precedence across the codebase
- Update the embedding function to support client configurations
- Add example script demonstrating custom client configuration usage

The changes maintain backward compatibility while providing a cleaner
and more maintainable approach to configuring OpenAI clients.
2025-03-27 15:39:39 -07:00
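A minimal sketch of what such a client-configuration helper might look like. The `create_openai_async_client` name comes from the commit message; the `client_configs` parameter and the merge/precedence logic are illustrative assumptions, not the exact code.

```python
import os
from typing import Optional
from openai import AsyncOpenAI


def create_openai_async_client(
    api_key: Optional[str] = None,
    base_url: Optional[str] = None,
    client_configs: Optional[dict] = None,
) -> AsyncOpenAI:
    """Build an AsyncOpenAI client, letting explicit arguments take
    precedence over values supplied in client_configs."""
    merged = dict(client_configs or {})
    # Explicit arguments win over client_configs entries, then the environment.
    merged["api_key"] = api_key or merged.get("api_key") or os.getenv("OPENAI_API_KEY")
    if base_url is not None:
        merged["base_url"] = base_url
    return AsyncOpenAI(**merged)


# Usage: pass extra options (timeouts, retries, headers) without adding new parameters.
# Assumes OPENAI_API_KEY is set in the environment.
client = create_openai_async_client(client_configs={"timeout": 30.0, "max_retries": 5})
```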
choizhang
8488229a29 feat: Add TokenTracker to track token usage for LLM calls 2025-03-28 01:25:15 +08:00
yangdx
dc99b714ba Update webui assets 2025-03-22 00:36:38 +08:00
Saifeddine ALOUI
3f8043ba43 Merge branch 'HKUDS:main' into main 2025-03-21 14:20:51 +01:00
Mario Vignieri
f33bcbb32c fix hf_embed torch device: use MPS or CPU when CUDA is not available (macOS users) 2025-03-20 09:40:56 +01:00
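The device-selection logic this fix refers to typically looks like the following sketch (illustrative, not the exact patch):

```python
import torch

# Prefer CUDA, then Apple-Silicon MPS, then CPU (helps macOS users without CUDA).
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

# model.to(device)  # move the Hugging Face embedding model onto the selected device
```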
Saifeddine ALOUI
c7d76b4cee Create anthropic.py 2025-03-17 10:21:01 +01:00
zrguo
adba09f6c2 fix stream 2025-03-17 11:41:55 +08:00
zrguo
f5ab76dc4c fix linting 2025-03-14 14:10:59 +08:00
lvyb
87474f7b2c fix stream 2025-03-12 16:57:51 +08:00
Zhichun Wu
d77401961d Resolve the issue with making API calls to Azure OpenAI service 2025-03-11 11:57:41 +08:00
Konrad Wojciechowski
4f76b1c23e fix AttributeError: 'NoneType' object has no attribute 'dim' 2025-02-24 10:28:15 +01:00
Pankaj Kaushal
6f09bfc970 Update LlamaIndex README: improve documentation and example paths
- Updated file paths for LlamaIndex examples
- Simplified README structure
- Corrected import statements to reflect new directory layout
- Removed outdated wrapper directory references
2025-02-20 10:33:15 +01:00
Pankaj Kaushal
173a806b9a Moved back to llm dir as per
https://github.com/HKUDS/LightRAG/pull/864#issuecomment-2669705946

- Created two new example scripts demonstrating LightRAG integration with LlamaIndex:
  - `lightrag_llamaindex_direct_demo.py`: Direct OpenAI integration
  - `lightrag_llamaindex_litellm_demo.py`: LiteLLM proxy integration
- Both examples showcase different search modes (naive, local, global, hybrid)
- Includes configuration for working directory, models, and API settings
- Demonstrates text insertion and querying using LightRAG with LlamaIndex
- removed wrapper directory and references to it
2025-02-20 10:23:01 +01:00
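For reference, exercising the different search modes mentioned in these demos generally follows this pattern (a sketch assuming `rag` is an already-initialized LightRAG instance):

```python
from lightrag import QueryParam

# rag = LightRAG(...)  # initialized elsewhere with working_dir, LLM and embedding functions

question = "What are the main themes of the document?"

# The demo scripts exercise each retrieval mode in turn.
for mode in ("naive", "local", "global", "hybrid"):
    answer = rag.query(question, param=QueryParam(mode=mode))
    print(f"[{mode}] {answer}\n")
```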
Pankaj Kaushal
203fdf2565 Remove LlamaIndex implementation from llm directory as per @MdNazishArmanShorthillsAI
- Deleted `lightrag/llm/llama_index_impl.py`
- Reorganization of the LlamaIndex wrapper location
2025-02-20 10:23:01 +01:00
Pankaj Kaushal
3b25e32e8d Removed verbose module-level documentation 2025-02-20 10:23:01 +01:00
Pankaj Kaushal
0b94117848 Add LlamaIndex LLM implementation module
- Implemented LlamaIndex interface for language model interactions
- Added async chat completion support
- Included embedding generation functionality
- Implemented retry mechanisms for API calls
- Added configuration and message formatting utilities
- Supports OpenAI-style message handling and external settings
2025-02-20 10:23:01 +01:00
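A rough sketch of the kind of wrapper such a module provides: converting OpenAI-style messages into LlamaIndex ChatMessage objects and awaiting an async chat completion with retries. Function name, retry settings, and the history-message dict format are illustrative assumptions.

```python
from llama_index.core.llms import ChatMessage
from tenacity import retry, stop_after_attempt, wait_exponential


@retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10))
async def llama_index_complete(llm, prompt, system_prompt=None, history_messages=None):
    """Send an OpenAI-style conversation through a LlamaIndex LLM and return the reply text."""
    messages = []
    if system_prompt:
        messages.append(ChatMessage(role="system", content=system_prompt))
    for msg in history_messages or []:
        # history_messages assumed to be [{"role": ..., "content": ...}, ...]
        messages.append(ChatMessage(role=msg["role"], content=msg["content"]))
    messages.append(ChatMessage(role="user", content=prompt))
    response = await llm.achat(messages)  # any LlamaIndex LLM with async chat support
    return response.message.content
```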
Yannick Stephan
55cd900e8e clean comments and unused libs 2025-02-18 21:12:06 +01:00
Yannick Stephan
ea41d08b9f removed torch from requirement lightrag server 2025-02-18 20:05:51 +01:00
Yannick Stephan
2524e02428 removed tqdm and cleaned up readme and ollama 2025-02-18 19:58:03 +01:00
Yannick Stephan
24ae083284 removed never used method 2025-02-18 19:38:04 +01:00
Yannick Stephan
161baa6f08 pm bs4 when ollama 2025-02-18 17:11:17 +01:00
Yannick Stephan
46e1865b98 cleanup code 2025-02-18 16:58:11 +01:00
MdNazishArmanShorthillsAI
44ef234002 Improved variable assignment to use your own Azure OpenAI embedding model 2025-02-17 12:43:51 +05:30
yangdx
b7cce9312f Fix linting 2025-02-17 12:34:54 +08:00
yangdx
d3ff8c3537 Set OpenAI logger level to INFO if VERBOSE_DEBUG is off 2025-02-17 12:20:47 +08:00
yangdx
806eadf5dc Add verbose debug option to control detailed debug output level
• Added VERBOSE env var & CLI flag
• Implemented verbose_debug() function
• Added verbose option to splash screen
• Reduced default debug output length
• Modified LLM debug logging behavior
2025-02-17 01:38:18 +08:00
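The gating described above can be pictured roughly as follows (a sketch; the actual flag handling and helper live in LightRAG's utilities):

```python
import os
import logging

logger = logging.getLogger("lightrag")

# VERBOSE env var (or a --verbose CLI flag) toggles full-length debug output.
VERBOSE_DEBUG = os.getenv("VERBOSE", "false").lower() == "true"


def verbose_debug(msg: str) -> None:
    """Log the full message when verbose debugging is on; otherwise truncate it."""
    if VERBOSE_DEBUG:
        logger.debug(msg)
    else:
        # Keep default debug output short when verbose mode is off.
        logger.debug(msg if len(msg) <= 100 else msg[:100] + "...")
```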
zrguo
5ffbb548ad Fix linting error 2025-02-11 13:32:24 +08:00
zrguo
2d2ed19095 Fix cache bugs 2025-02-11 13:28:18 +08:00
yangdx
a61db0852a Fix linting 2025-02-06 23:04:43 +03:00
yangdx
b90f3f14be Add LightRAG version to User-Agent header for better request tracking
• Add User-Agent header with version info
• Update header creation in Ollama client
• Update header creation in OpenAI client
• Ensure consistent header format
• Include Mozilla UA string for OpenAI
2025-02-06 23:04:19 +03:00
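The header construction described here amounts to something like the following sketch; the helper name and the exact User-Agent format are illustrative.

```python
def create_versioned_headers(version: str) -> dict:
    """Build request headers carrying the LightRAG version in the User-Agent."""
    return {
        # Browser-like UA string combined with the package version, as the commit describes.
        "User-Agent": f"Mozilla/5.0 (compatible; LightRAG/{version})",
        "Content-Type": "application/json",
    }


# e.g. AsyncOpenAI(default_headers=create_versioned_headers(lightrag.__version__))
```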
yangdx
f72e4e6830 Enhance OpenAI API error handling and logging for better reliability
• Add InvalidResponseError custom exception
• Improve error logging for API failures
• Add empty response content validation
• Add more detailed debug logging info
• Add retry for invalid response cases
2025-02-06 23:03:31 +03:00
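The retry-on-invalid-response behaviour can be sketched like this; the exception class name follows the commit summary, while the wrapped call and retry settings are illustrative.

```python
from tenacity import retry, stop_after_attempt, wait_exponential, retry_if_exception_type
from openai import APIConnectionError, RateLimitError


class InvalidResponseError(Exception):
    """Raised when the OpenAI API returns an empty or malformed response."""


@retry(
    stop=stop_after_attempt(3),
    wait=wait_exponential(multiplier=1, min=4, max=10),
    retry=retry_if_exception_type((RateLimitError, APIConnectionError, InvalidResponseError)),
)
async def complete_with_validation(client, **kwargs) -> str:
    response = await client.chat.completions.create(**kwargs)
    content = response.choices[0].message.content
    if not content or not content.strip():
        # Treat an empty completion as retryable rather than returning it to the caller.
        raise InvalidResponseError("Received empty content from OpenAI API")
    return content
```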
ultrageopro
19ee3d109c feat: trimming the model’s reasoning 2025-02-06 22:56:17 +03:00
yangdx
290a4d5ec0 Fix linting 2025-02-06 16:24:02 +08:00
yangdx
eb5f57e989 fix: Fix potential mutable default parameter issue 2025-02-06 14:46:07 +08:00
yangdx
24effb127d Improve error handling and response consistency in streaming endpoints
• Add error message forwarding to client
• Handle stream cancellations gracefully
• Add logging for stream errors
• Ensure clean stream termination
• Add try-catch in OpenAI streaming
2025-02-05 10:44:48 +08:00
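A sketch of the streaming pattern this commit describes: wrap the OpenAI stream in try/except, forward an error message to the client, and end the stream cleanly. The payload shape is illustrative.

```python
import asyncio
import json
import logging

logger = logging.getLogger("lightrag")


async def stream_response(openai_stream):
    """Yield chunks from an AsyncOpenAI chat stream, forwarding errors instead of dropping the connection."""
    try:
        async for chunk in openai_stream:
            if chunk.choices and chunk.choices[0].delta.content:
                yield json.dumps({"response": chunk.choices[0].delta.content}) + "\n"
    except asyncio.CancelledError:
        # Client disconnected mid-stream: stop quietly rather than raising.
        logger.info("Stream cancelled by client")
    except Exception as exc:
        logger.error(f"Stream error: {exc}")
        # Forward the error to the client so it can surface a message, then terminate the stream.
        yield json.dumps({"error": str(exc)}) + "\n"
```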
Saifeddine ALOUI
06c9e4e454 Fixed missing imports bug and fixed linting 2025-01-25 00:55:07 +01:00
Saifeddine ALOUI
34018cb1e0 Separated llms from the main llm.py file and fixed some deprecation bugs 2025-01-25 00:11:00 +01:00