212 Commits

Author SHA1 Message Date
yangdx
65dc0a6cfd Fix linting 2025-02-06 02:50:27 +08:00
yangdx
e26c6e564d refactor: enhance stream error handling and optimize code structure
- Initialize timestamps at start to avoid null checks
- Add detailed error handling for streaming response
- Handle CancelledError and other exceptions separately
- Unify exception handling with trace_exception
- Clean up redundant code and simplify logic
2025-02-06 02:43:06 +08:00
yangdx
fd9b3b2658 Fix splash screen SSL line connector type. 2025-02-06 01:21:42 +08:00
yangdx
db9b4dc841 Added environment variable loading with dotenv in Ollama API 2025-02-06 01:00:49 +08:00
yangdx
1a61d9ee7f Fix linting 2025-02-05 22:29:07 +08:00
yangdx
f703334ce4 Split the Ollama API implementation into a separate file 2025-02-05 22:15:14 +08:00
yangdx
f77faf8023 Fix linting 2025-02-05 12:36:52 +08:00
yangdx
4663dcfbab Merge branch 'main' into handle-stream-cancel-error 2025-02-05 12:27:05 +08:00
yangdx
f1ea7f7415 Update error response format in streaming API to a normal message, so users can see what's going on. 2025-02-05 11:07:31 +08:00
yangdx
24effb127d Improve error handling and response consistency in streaming endpoints
• Add error message forwarding to client
• Handle stream cancellations gracefully
• Add logging for stream errors
• Ensure clean stream termination
• Add try-catch in OpenAI streaming
2025-02-05 10:44:48 +08:00
yangdx
ff40e61fad Fix linting 2025-02-05 09:47:39 +08:00
yangdx
69f200faf2 feat: improve error handling for streaming responses
• Add CancelledError handling for streams
• Send error details to client in JSON
• Add error status codes and messages
• Always send final completion marker
• Refactor stream generator error handling
2025-02-05 09:46:56 +08:00
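The error-handling pattern this commit describes (handle CancelledError separately, forward error details to the client as JSON, always terminate the stream with a completion marker) might be sketched roughly as follows. The generator name and chunk shapes are illustrative assumptions, not LightRAG's actual code.

```python
import asyncio
import json


async def stream_chunks(token_source):
    """Wrap a token stream in NDJSON chunks with explicit error forwarding."""
    try:
        async for token in token_source:
            yield json.dumps({"response": token, "done": False}) + "\n"
    except asyncio.CancelledError:
        # Client disconnected mid-stream: re-raise so the server can
        # clean up; there is no one left to receive a final marker.
        raise
    except Exception as exc:
        # Forward the error to the client instead of silently dropping
        # the stream.
        yield json.dumps({"error": str(exc)}) + "\n"
    # Normal and handled-error paths both end with a completion marker.
    yield json.dumps({"done": True}) + "\n"
```

Keeping the completion marker outside the `try` block is what makes "always send final completion marker" hold for both the success path and the handled-error path.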
ArnoChen
eb77af8e7d Reapply "Integrated the GraphML Visualizer as an optional component of LightRAG"
This reverts commit b2bc7770fd9d1f55dfec8e06b646bda3ecd609ea.
2025-02-05 02:33:26 +08:00
zrguo
b2bc7770fd Revert "Integrated the GraphML Visualizer as an optional component of LightRAG" 2025-02-05 01:30:57 +08:00
Saifeddine ALOUI
9a30dc7b04 Integrated the graphml visualizer as part of lightrag and made it a component that can be installed using [tools] option 2025-02-03 22:51:46 +01:00
Saifeddine ALOUI
797b5fa463 Merge branch 'HKUDS:main' into main 2025-02-03 22:05:59 +01:00
Saifeddine ALOUI
da6864d9c6 Merge branch 'HKUDS:main' into main 2025-02-03 11:24:08 +01:00
yangdx
5cf875755a Update API endpoint documentation to clarify Ollama server compatibility
• Add Ollama server doc for /api/tags
• Update /api/generate endpoint docs
• Update /api/chat endpoint docs
2025-02-03 13:07:08 +08:00
yangdx
4ab02a878f Fix linting 2025-02-03 12:39:52 +08:00
yangdx
ede4122b63 docs: add documentation for /bypass prefix in LightRAG api 2025-02-03 12:25:59 +08:00
yangdx
a8f7b7e2b7 Add "/bypass" mode to skip context retrieval and directly use LLM
• Added SearchMode.bypass enum value
• Added /bypass prefix handler
• Skip RAG when in bypass mode
• Pass conversation history to LLM
• Apply bypass mode for both stream/non-stream
2025-02-03 11:49:17 +08:00
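The prefix-routing idea in this commit (a `/bypass` prefix that selects a `SearchMode.bypass` enum value and strips the prefix from the query) can be sketched like this. The enum values and the parsing helper are illustrative assumptions modeled on the bullet points above, not LightRAG's actual implementation.

```python
from enum import Enum


class SearchMode(str, Enum):
    naive = "naive"
    local = "local"
    global_ = "global"
    hybrid = "hybrid"
    bypass = "bypass"  # skip context retrieval, query the LLM directly


def parse_query_mode(query: str, default: SearchMode = SearchMode.hybrid):
    """Return (mode, cleaned_query); a leading '/mode ' prefix selects it."""
    if query.startswith("/"):
        prefix, _, rest = query.partition(" ")
        name = prefix[1:]
        for mode in SearchMode:
            if mode.value == name:
                return mode, rest
    return default, query
```

With this shape, the request handler can branch on `mode is SearchMode.bypass` to skip RAG retrieval and pass the cleaned query (plus conversation history) straight to the LLM, for both streaming and non-streaming calls.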
Saifeddine ALOUI
c65dcff991 Fixed a typo 2025-02-02 09:47:05 +01:00
yangdx
7ea1856699 Add comment to clarify LLM cache setting for entity extraction 2025-02-02 07:29:01 +08:00
yangdx
6e1b5d6ce6 Merge branch 'main' into fix-concurrent-problem 2025-02-02 04:36:52 +08:00
yangdx
ecf48a5be5 Add embedding cache config and disable LLM cache for entity extraction for API Server 2025-02-02 04:27:21 +08:00
yangdx
95edf8a51e Fix linting 2025-02-01 15:22:40 +08:00
Saifeddine ALOUI
3a40772d30 Simplified file loading 2025-02-01 01:19:32 +01:00
Saifeddine ALOUI
e09cb85f37 fixed linting as well as file path 2025-02-01 01:15:06 +01:00
Saifeddine ALOUI
ef35f9a4e4 Introduced docling instead of other tools for loading files 2025-02-01 00:56:43 +01:00
zrguo
e59cb7493c fixed linting 2025-01-31 23:35:42 +08:00
Saifeddine ALOUI
78b858c03b Finished testing api key 2025-01-31 16:19:46 +01:00
Saifeddine ALOUI
d2a550fd31 Update api.js 2025-01-31 16:08:23 +01:00
Saifeddine ALOUI
d1210851aa Update api.js 2025-01-31 16:07:27 +01:00
Saifeddine ALOUI
e9591548b4 Update api.js 2025-01-31 16:03:31 +01:00
Saifeddine ALOUI
2444975bf1 Update api.js 2025-01-31 13:22:19 +01:00
Saifeddine ALOUI
6889606a48 Update lightrag_server.py 2025-01-31 11:19:12 +01:00
Saifeddine ALOUI
381f7deec6 linting 2025-01-30 23:29:21 +01:00
Saifeddine ALOUI
219cbab1e3 Added progress when scanning files and fixed some bugs in the API 2025-01-30 23:27:43 +01:00
yangdx
46c9c7d95b Update sample env file and documentation
- Change COSINE_THRESHOLD to 0.4
- Adjust TOP_K to 50
- Enhance API README details
2025-01-29 23:45:20 +08:00
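The variable names and values in this commit suggest a sample `.env` fragment along these lines; `TOP_K` and `COSINE_THRESHOLD` come from the commit message, while any surrounding keys in the real sample file are omitted here.

```
# Values named in the commit above
TOP_K=50
COSINE_THRESHOLD=0.4
```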
yangdx
5c7b2d7c9f Merge from main 2025-01-29 22:13:01 +08:00
yangdx
7ff8c7b9d8 Add timeout parameter to OpenAI-compatible LLM model configuration 2025-01-29 21:35:46 +08:00
yangdx
7aedc08caf Add RAG configuration options and enhance parameter configurability
- Add top-k and cosine-threshold params for the API server
- Update .env and CLI param handling with new parameters
- Improve splash screen display
- Update bash and storage classes to read new parameters from .env file
2025-01-29 21:34:34 +08:00
Saifeddine ALOUI
b5d09725f5 linting 2025-01-28 18:20:45 +01:00
Saifeddine ALOUI
4ab1deaf25 Fixed --simulated-model-name argument 2025-01-28 15:32:41 +01:00
Saifeddine ALOUI
d493830b10 Update lightrag_server.py 2025-01-28 15:30:36 +01:00
Saifeddine ALOUI
9fdea743e4 Update lightrag_server.py 2025-01-28 15:03:26 +01:00
zrguo
80451af839 fix linting errors 2025-01-27 23:21:34 +08:00
zrguo
7dd6bd4ceb Merge branch 'main' into main 2025-01-27 23:16:06 +08:00
zrguo
ba40a8de8b Merge pull request #659 from danielaskdd/cvs_robustness
Enhance robustness of CSV processing, fix potential CSV parsing issues
2025-01-27 23:12:43 +08:00
Saifeddine ALOUI
e4b2a5956e Upgraded ui 2025-01-27 12:49:12 +01:00