yangdx
56f82bdcd5
Ensure OpenAI connection is closed after streaming response finished
2025-05-12 17:37:28 +08:00
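The fix above, closing the client only once the streaming response is fully consumed, can be sketched with a `finally` block inside the async generator. `FakeAsyncClient` is a hypothetical stand-in for the real `AsyncOpenAI` client; only the close-after-stream semantics are illustrated.

```python
import asyncio

# Hypothetical stand-in for an AsyncOpenAI client; only the
# close() semantics matter for this sketch.
class FakeAsyncClient:
    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True

    async def stream_chunks(self):
        for chunk in ["Hello", ", ", "world"]:
            yield chunk

async def stream_response(client):
    """Yield chunks, guaranteeing the client is closed when the
    stream finishes, errors out, or is cancelled."""
    try:
        async for chunk in client.stream_chunks():
            yield chunk
    finally:
        # Runs on normal exhaustion, exceptions, and aclose().
        await client.close()

async def main():
    client = FakeAsyncClient()
    parts = [c async for c in stream_response(client)]
    return "".join(parts), client.closed

result, closed = asyncio.run(main())
```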
yangdx
c2938a71a4
Fix streaming problem for OpenAI
2025-05-09 15:54:54 +08:00
Arjun Rao
b7eae4d7c0
Use the context manager for the openai client
...
This avoids resource-cleanup issues (too many open files) when making massively parallel calls to the OpenAI API, since RAII-style cleanup in Python is highly unreliable in such contexts.
2025-05-08 11:42:53 +10:00
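The context-manager pattern adopted above can be sketched as follows. `FakeAsyncClient` is a hypothetical stand-in; the real `AsyncOpenAI` client supports `async with`, which guarantees deterministic cleanup instead of relying on garbage collection.

```python
import asyncio

# Hypothetical stand-in client implementing the async context
# manager protocol, like AsyncOpenAI does.
class FakeAsyncClient:
    def __init__(self):
        self.closed = False

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.closed = True

    async def complete(self, prompt):
        return prompt.upper()

async def call_api(prompt):
    # The context manager guarantees cleanup even under heavy
    # concurrency, where relying on garbage collection can leak
    # file descriptors ("too many open files").
    async with FakeAsyncClient() as client:
        return await client.complete(prompt)

async def main():
    # Massively parallel calls: each one cleans up its own client.
    return await asyncio.gather(*(call_api("hi") for _ in range(100)))

results = asyncio.run(main())
```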
yangdx
34cc8b6a51
Fix linting
2025-04-29 17:52:07 +08:00
yangdx
f58c8276bc
fix: correct retry_if_exception_type usage and improve async iterator resource management
...
- Corrects the syntax of retry_if_exception_type decorators to ensure proper exception handling and retry behavior
- Implements proper resource cleanup for async iterators to prevent memory leaks and potential SIGSEGV errors
2025-04-29 17:43:27 +08:00
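The second part of this commit, explicit async iterator cleanup, can be sketched as below: when a consumer stops reading a stream early, calling `aclose()` runs the generator's cleanup deterministically instead of leaving it to garbage collection (which the commit associates with leaks and SIGSEGV crashes). The retry half of the fix concerns tenacity's `retry_if_exception_type`, which must be called with the exception class rather than passed uncalled; that part is omitted here to keep the sketch stdlib-only.

```python
import asyncio

async def chunk_stream():
    # Stand-in for a streaming API response.
    try:
        for i in range(10):
            yield i
    finally:
        # In the real client, cleanup here releases network
        # resources; it runs when the consumer calls aclose().
        pass

async def consume_partially():
    it = chunk_stream()
    received = []
    try:
        async for chunk in it:
            received.append(chunk)
            if chunk == 2:
                break  # consumer stops early
    finally:
        # Explicitly close the async iterator instead of relying
        # on garbage collection at interpreter shutdown.
        await it.aclose()
    return received

received = asyncio.run(consume_partially())
```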
yangdx
39540f3f8b
Fix linting
2025-04-20 14:33:33 +08:00
yangdx
5f2cd871a8
Update sample code and README
2025-04-20 14:33:16 +08:00
yangdx
a418b18ed1
Fix linting
2025-04-20 11:17:51 +08:00
Enoughappens
704ef16ce3
fix streaming "list index out of range"
2025-04-19 12:57:08 +08:00
yangdx
14b4bc96ce
Fix OPENAI_API_BASE not working in .env
2025-04-17 05:20:22 +08:00
Qodi
8f3068f1c0
Update openai.py
...
Add an environment variable so OPENAI_API_BASE is supported, enabling proxy/relay endpoints
2025-04-10 12:10:35 +08:00
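The two commits above amount to resolving the API base URL from the environment. A minimal sketch of the precedence (explicit argument, then `OPENAI_API_BASE`, then the official default); `resolve_base_url` is a hypothetical helper name:

```python
import os

def resolve_base_url(explicit_base_url=None):
    """Pick the API base URL: an explicit argument wins, then the
    OPENAI_API_BASE environment variable (useful for proxy/relay
    endpoints), then the official default."""
    return (
        explicit_base_url
        or os.environ.get("OPENAI_API_BASE")
        or "https://api.openai.com/v1"
    )

os.environ["OPENAI_API_BASE"] = "https://relay.example.com/v1"
url = resolve_base_url()
```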
zrguo
e17e61f58e
fix lint
2025-04-03 14:44:56 +08:00
zrguo
9648300b18
Merge pull request #1208 from shane-lil/openai-client-config
...
feat(openai): add client configuration support to OpenAI integration
2025-04-03 17:43:57 +11:00
yangdx
80335d57a5
Fix linting
2025-03-28 21:43:47 +08:00
yangdx
491c78dac1
Improve OpenAI LLM logging with more detailed debug information
2025-03-28 21:33:59 +08:00
Shane Walker
d45dc14069
feat(openai): add client configuration support to OpenAI integration
...
Add support for custom client configurations in the OpenAI integration,
allowing for more flexible configuration of the AsyncOpenAI client.
This includes:
- Create a reusable helper function `create_openai_async_client`
- Add proper documentation for client configuration options
- Ensure consistent parameter precedence across the codebase
- Update the embedding function to support client configurations
- Add example script demonstrating custom client configuration usage
The changes maintain backward compatibility while providing a cleaner
and more maintainable approach to configuring OpenAI clients.
2025-03-27 15:39:39 -07:00
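The consistent parameter precedence this commit describes can be sketched as a config-merging helper in the spirit of `create_openai_async_client`. The function below is a hypothetical simplification: it returns the merged dict rather than constructing a real `AsyncOpenAI` client.

```python
import os

def create_client_config(api_key=None, base_url=None, client_configs=None):
    """Sketch of a create_openai_async_client-style helper: merge a
    user-supplied client_configs dict with explicit arguments, where
    explicit arguments take precedence."""
    config = dict(client_configs or {})
    # Explicit parameters override anything in client_configs.
    config["api_key"] = api_key or os.environ.get("OPENAI_API_KEY", "")
    if base_url is not None:
        config["base_url"] = base_url
    return config  # in the real helper: AsyncOpenAI(**config)

cfg = create_client_config(
    api_key="sk-test",
    base_url="https://api.example.com/v1",
    client_configs={"timeout": 30.0, "base_url": "ignored"},
)
```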
choizhang
8488229a29
feat: Add TokenTracker to track token usage for LLM calls
2025-03-28 01:25:15 +08:00
zrguo
adba09f6c2
fix stream
2025-03-17 11:41:55 +08:00
zrguo
f5ab76dc4c
fix linting
2025-03-14 14:10:59 +08:00
lvyb
87474f7b2c
fix stream
2025-03-12 16:57:51 +08:00
Yannick Stephan
55cd900e8e
clean comments and unused libs
2025-02-18 21:12:06 +01:00
Yannick Stephan
46e1865b98
cleanup code
2025-02-18 16:58:11 +01:00
yangdx
b7cce9312f
Fix linting
2025-02-17 12:34:54 +08:00
yangdx
d3ff8c3537
Set OpenAI logger level to INFO if VERBOSE_DEBUG is off
2025-02-17 12:20:47 +08:00
yangdx
806eadf5dc
Add verbose debug option to control detailed debug output level
...
• Added VERBOSE env var & CLI flag
• Implemented verbose_debug() function
• Added verbose option to splash screen
• Reduced default debug output length
• Modified LLM debug logging behavior
2025-02-17 01:38:18 +08:00
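The "reduced default debug output length" behavior above can be sketched as a small gating helper. `format_debug` and the 50-character limit are illustrative assumptions, not the real `verbose_debug()` implementation:

```python
import os

def format_debug(msg, verbose=None, limit=50):
    """Return the message untouched when verbose debugging is on,
    otherwise truncated to `limit` characters with an ellipsis."""
    if verbose is None:
        # Mirrors a VERBOSE env var / CLI flag controlling detail.
        verbose = os.environ.get("VERBOSE", "false").lower() == "true"
    if verbose or len(msg) <= limit:
        return msg
    return msg[:limit] + "..."

short = format_debug("x" * 200, verbose=False)
full = format_debug("x" * 200, verbose=True)
```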
zrguo
5ffbb548ad
Fix linting error
2025-02-11 13:32:24 +08:00
zrguo
2d2ed19095
Fix cache bugs
2025-02-11 13:28:18 +08:00
yangdx
8fdba5d4db
Fix linting
2025-02-06 23:12:35 +08:00
yangdx
2760433634
Add LightRAG version to User-Agent header for better request tracking
...
• Add User-Agent header with version info
• Update header creation in Ollama client
• Update header creation in OpenAI client
• Ensure consistent header format
• Include Mozilla UA string for OpenAI
2025-02-06 22:55:22 +08:00
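A minimal sketch of the version-stamped header described above; the version string and exact UA format are placeholders, not the real LightRAG values:

```python
__api_version__ = "1.0.0"  # placeholder, not the real LightRAG version

def default_headers():
    """Build consistent request headers with a Mozilla-style
    User-Agent carrying the library version, so server-side logs
    can attribute requests."""
    return {
        "User-Agent": f"Mozilla/5.0 (compatible; LightRAG/{__api_version__})",
        "Content-Type": "application/json",
    }

headers = default_headers()
```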
yangdx
506e39e14e
Enhance OpenAI API error handling and logging for better reliability
...
• Add InvalidResponseError custom exception
• Improve error logging for API failures
• Add empty response content validation
• Add more detailed debug logging info
• Add retry for invalid response cases
2025-02-06 19:42:57 +08:00
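The validate-and-retry flow above can be sketched as follows. `call_with_retry` is a deliberately minimal stand-in for the tenacity-based retry wrapper the real code uses:

```python
class InvalidResponseError(Exception):
    """Raised when the API returns a well-formed but unusable
    response (e.g. empty content), so it can be retried."""

def validate_content(content):
    # Empty response content validation.
    if not content or not content.strip():
        raise InvalidResponseError("Received empty content from API")
    return content

def call_with_retry(fn, attempts=3):
    # Minimal stand-in for a tenacity-style retry wrapper.
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except InvalidResponseError as exc:
            last_exc = exc
    raise last_exc

# Simulate two empty responses followed by a good one.
responses = iter(["", "  ", "real answer"])
result = call_with_retry(lambda: validate_content(next(responses)))
```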
yangdx
290a4d5ec0
Fix linting
2025-02-06 16:24:02 +08:00
yangdx
eb5f57e989
fix: Fix potential mutable default parameter issue
2025-02-06 14:46:07 +08:00
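The mutable-default pitfall this commit fixes is the classic `history_messages=[]` default, which is created once and shared across all calls. A sketch of the `None`-sentinel fix (the function name and message shape are illustrative):

```python
def build_messages(prompt, history_messages=None):
    """Avoid the mutable-default pitfall: a `history_messages=[]`
    default would be shared across calls and accumulate state."""
    if history_messages is None:
        history_messages = []
    messages = list(history_messages)
    messages.append({"role": "user", "content": prompt})
    return messages

# Each call starts from a fresh history instead of a shared list.
first = build_messages("a")
second = build_messages("b")
```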
yangdx
24effb127d
Improve error handling and response consistency in streaming endpoints
...
• Add error message forwarding to client
• Handle stream cancellations gracefully
• Add logging for stream errors
• Ensure clean stream termination
• Add try-catch in OpenAI streaming
2025-02-05 10:44:48 +08:00
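The bullet points above can be sketched as a wrapper generator: forward the error to the client as a final chunk, let cancellation (client disconnect) terminate cleanly, and end the stream in both cases. The error-message format is an illustrative assumption:

```python
import asyncio

async def flaky_chunks():
    # Stand-in upstream that fails mid-stream.
    yield "partial "
    raise RuntimeError("upstream connection dropped")

async def stream_with_errors(source):
    """Forward chunks; on failure, send the error to the client as
    a final message instead of silently dropping the stream."""
    try:
        async for chunk in source:
            yield chunk
    except asyncio.CancelledError:
        # Client disconnected: terminate cleanly, nothing to send.
        raise
    except Exception as exc:
        yield f"[stream error: {exc}]"

async def main():
    return [c async for c in stream_with_errors(flaky_chunks())]

chunks = asyncio.run(main())
```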
Saifeddine ALOUI
06c9e4e454
Fixed missing imports bug and fixed linting
2025-01-25 00:55:07 +01:00
Saifeddine ALOUI
34018cb1e0
Separated llms from the main llm.py file and fixed some deprecation bugs
2025-01-25 00:11:00 +01:00