290 Commits

Author SHA1 Message Date
ArnoChen
d418ceee82 add label filter 2025-02-11 06:48:04 +08:00
yangdx
cddde8053d Add configuration examples for Oracle, TiDB, PostgreSQL and storage backends 2025-02-11 06:31:59 +08:00
yangdx
56c1792767 feat: optimize storage configuration and environment variables
* add storage type compatibility validation table
* add environment variables check for storage
* modify storage init to read settings from config.ini and env
2025-02-11 00:55:52 +08:00
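The commit above describes a precedence order for storage settings: environment variable first, then config.ini, then a built-in default. A minimal sketch of that lookup (function and section names are hypothetical, not the actual LightRAG code):

```python
# Illustrative sketch, assuming a [storage] section in config.ini and
# upper-cased environment variable names; not the real LightRAG helper.
import configparser
import os


def get_storage_setting(key: str, default: str, config_path: str = "config.ini") -> str:
    """Resolve a setting: env var wins, then config.ini, then the default."""
    env_value = os.environ.get(key.upper())
    if env_value is not None:
        return env_value
    parser = configparser.ConfigParser()
    parser.read(config_path)  # a missing file is silently ignored
    return parser.get("storage", key, fallback=default)
```
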
ArnoChen
4d4ace295a remove test button 2025-02-10 23:57:26 +08:00
ArnoChen
beee79d013 better error handling 2025-02-10 23:44:58 +08:00
ArnoChen
7d6ffbbd87 implement backend health check and alert system 2025-02-10 23:33:51 +08:00
ArnoChen
a08f59f663 add properties view 2025-02-10 22:02:06 +08:00
ArnoChen
07f19b939c Merge branch 'main' into graph-viewer-webui 2025-02-10 13:56:25 +08:00
ArnoChen
a55b3263f1 move config.ini.example to root directory 2025-02-10 01:30:55 +08:00
ArnoChen
ee5c00e0c6 add example configuration file for API
format
2025-02-10 01:23:41 +08:00
ArnoChen
09195182c2 enable MongoGraphStorage based on config
mongo graph
2025-02-10 01:07:41 +08:00
ArnoChen
294d094076 use "database" instead of "LightRAG" for MongoDB config 2025-02-10 01:00:02 +08:00
ArnoChen
c858e4a4e6 add qdrant backend 2025-02-10 00:57:28 +08:00
ArnoChen
10a0fa1530 use relative path for graph API endpoint 2025-02-10 00:38:04 +08:00
ArnoChen
1c7f918d8c format
2025-02-10 00:33:59 +08:00
ArnoChen
09ee968926 fix mount 2025-02-09 23:28:42 +08:00
ArnoChen
489d7f6c12 mount graph viewer webui in /webui endpoint 2025-02-09 23:17:30 +08:00
ArnoChen
b5eb51a861 add graph viewer webui 2025-02-09 23:17:26 +08:00
ArnoChen
c1d7fbe02b fix typo 2025-02-09 22:22:44 +08:00
ArnoChen
f974bf39bb format
2025-02-08 13:53:00 +08:00
ArnoChen
88d691deb9 add namespace prefix to storage namespaces 2025-02-08 13:53:00 +08:00
yangdx
0890c6ad7e update docs for service installation 2025-02-07 11:05:38 +08:00
yangdx
81377da418 Merge tag 'time-temp' into improve-ollama-api-streaming 2025-02-06 23:00:32 +08:00
yangdx
070501fdaa Merge branch 'add-keyword-extraction-param-for-llm' into fix-mutable-default-param 2025-02-06 16:22:30 +08:00
yangdx
eb9883d8da fix: add keyword_extraction param support for LLM func of API Server 2025-02-06 15:56:18 +08:00
yangdx
eb5f57e989 fix: Fix potential mutable default parameter issue 2025-02-06 14:46:07 +08:00
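The mutable-default fix above targets a classic Python pitfall: a list or dict used as a default argument is created once and shared across every call. A minimal sketch of the bug and the usual remedy (function names are hypothetical):

```python
# Illustrative sketch of the mutable-default-parameter pitfall; not the
# actual LightRAG function that was patched.

def query_buggy(text, history=[]):      # one shared list across ALL calls
    history.append(text)
    return history


def query_fixed(text, history=None):    # fresh list per call when omitted
    if history is None:
        history = []
    history.append(text)
    return history
```
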
yangdx
9103e7f463 fix: improve timing accuracy and variable scoping in OllamaAPI 2025-02-06 10:42:49 +08:00
yangdx
e124ad7f9c Fix timing calculation logic in OllamaAPI stream generators
• Initialize first_chunk_time as None
• Set timing only when first chunk arrives
2025-02-06 04:53:05 +08:00
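The timing fix described in the bullets above (initialize `first_chunk_time` as `None`, set it only when the first chunk arrives) can be sketched as follows. The metric names mirror Ollama-style response fields and are an assumption, not the exact LightRAG code:

```python
# Illustrative sketch: time-to-first-chunk is measured only once real data
# arrives, so an empty stream reports zero eval time instead of garbage.
import time


def consume_stream(chunks):
    """Drain a chunk iterator, timing the first chunk separately."""
    start = time.time_ns()
    first_chunk_time = None
    out = []
    for chunk in chunks:
        if first_chunk_time is None:
            first_chunk_time = time.time_ns()  # set exactly once, on arrival
        out.append(chunk)
    last_chunk_time = time.time_ns()
    metrics = {
        "prompt_eval_duration": (first_chunk_time or last_chunk_time) - start,
        "eval_duration": last_chunk_time - (first_chunk_time or last_chunk_time),
    }
    return out, metrics
```
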
yangdx
65dc0a6cfd Fix linting 2025-02-06 02:50:27 +08:00
yangdx
e26c6e564d refactor: enhance stream error handling and optimize code structure
- Initialize timestamps at start to avoid null checks
- Add detailed error handling for streaming response
- Handle CancelledError and other exceptions separately
- Unify exception handling with trace_exception
- Clean up redundant code and simplify logic
2025-02-06 02:43:06 +08:00
yangdx
fd9b3b2658 Fix splash screen SSL line connector type. 2025-02-06 01:21:42 +08:00
yangdx
db9b4dc841 Added environment variable loading with dotenv in Ollama API 2025-02-06 01:00:49 +08:00
yangdx
1a61d9ee7f Fix linting 2025-02-05 22:29:07 +08:00
yangdx
f703334ce4 Split the Ollama API implementation into a separate file 2025-02-05 22:15:14 +08:00
yangdx
f77faf8023 Fix linting 2025-02-05 12:36:52 +08:00
yangdx
4663dcfbab Merge branch 'main' into handle-stream-cancel-error 2025-02-05 12:27:05 +08:00
yangdx
f1ea7f7415 update error response format in streaming API to a normal message, so users can see what's going on 2025-02-05 11:07:31 +08:00
yangdx
24effb127d Improve error handling and response consistency in streaming endpoints
• Add error message forwarding to client
• Handle stream cancellations gracefully
• Add logging for stream errors
• Ensure clean stream termination
• Add try-catch in OpenAI streaming
2025-02-05 10:44:48 +08:00
yangdx
ff40e61fad Fix linting 2025-02-05 09:47:39 +08:00
yangdx
69f200faf2 feat: improve error handling for streaming responses
• Add CancelledError handling for streams
• Send error details to client in JSON
• Add error status codes and messages
• Always send final completion marker
• Refactor stream generator error handling
2025-02-05 09:46:56 +08:00
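The two streaming-error commits above describe the same pattern: forward errors to the client as JSON, handle cancellation separately, and always close with a completion marker. A minimal async sketch of that shape (the line format loosely follows Ollama's NDJSON streaming; names are illustrative, not the actual LightRAG handler):

```python
# Illustrative sketch of the streaming error-handling pattern: errors are
# surfaced to the client instead of silently dropping the connection, and a
# final {"done": true} marker is always emitted.
import asyncio
import json


async def stream_response(chunks):
    try:
        async for chunk in chunks:
            yield json.dumps({"response": chunk, "done": False}) + "\n"
    except asyncio.CancelledError:
        # client disconnected mid-stream; terminate quietly
        return
    except Exception as exc:
        # send error details to the client as a normal message
        yield json.dumps({"error": str(exc), "done": False}) + "\n"
    yield json.dumps({"done": True}) + "\n"  # final completion marker
```
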
ArnoChen
eb77af8e7d Reapply "Integrated the GraphML Visualizer as an optional component of LightRAG"
This reverts commit b2bc7770fd9d1f55dfec8e06b646bda3ecd609ea.
2025-02-05 02:33:26 +08:00
zrguo
b2bc7770fd
Revert "Integrated the GraphML Visualizer as an optional component of LightRAG" 2025-02-05 01:30:57 +08:00
Saifeddine ALOUI
9a30dc7b04 Integrated the graphml visualizer as part of lightrag and made it a component that can be installed using [tools] option 2025-02-03 22:51:46 +01:00
Saifeddine ALOUI
797b5fa463
Merge branch 'HKUDS:main' into main 2025-02-03 22:05:59 +01:00
Saifeddine ALOUI
da6864d9c6
Merge branch 'HKUDS:main' into main 2025-02-03 11:24:08 +01:00
yangdx
5cf875755a Update API endpoint documentation to clarify Ollama server compatibility
• Add Ollama server doc for /api/tags
• Update /api/generate endpoint docs
• Update /api/chat endpoint docs
2025-02-03 13:07:08 +08:00
yangdx
4ab02a878f Fix linting 2025-02-03 12:39:52 +08:00
yangdx
ede4122b63 docs: add documentation for /bypass prefix in LightRAG api 2025-02-03 12:25:59 +08:00
yangdx
a8f7b7e2b7 Add "/bypass" mode to skip context retrieval and directly use LLM
• Added SearchMode.bypass enum value
• Added /bypass prefix handler
• Skip RAG when in bypass mode
• Pass conversation history to LLM
• Apply bypass mode for both stream/non-stream
2025-02-03 11:49:17 +08:00
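The "/bypass" commit above adds a prefix that routes a query straight to the LLM, skipping retrieval. A minimal sketch of the prefix handling (the enum is simplified; the real `SearchMode` lives in the LightRAG API server):

```python
# Illustrative sketch: detect the "/bypass " prefix, strip it, and switch
# the search mode so the RAG retrieval step is skipped.
from enum import Enum


class SearchMode(Enum):
    hybrid = "hybrid"
    bypass = "bypass"


def parse_query_mode(query: str, default: SearchMode = SearchMode.hybrid):
    """Return (cleaned_query, mode); a '/bypass ' prefix skips retrieval."""
    if query.startswith("/bypass "):
        return query[len("/bypass "):], SearchMode.bypass
    return query, default
```
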
Saifeddine ALOUI
c65dcff991 Fixed a typo 2025-02-02 09:47:05 +01:00