ArnoChen
c1d7fbe02b
fix typo
2025-02-09 22:22:44 +08:00
ArnoChen
f974bf39bb
format
2025-02-08 13:53:00 +08:00
ArnoChen
88d691deb9
add namespace prefix to storage namespaces
2025-02-08 13:53:00 +08:00
yangdx
070501fdaa
Merge branch 'add-keyword-extraction-param-for-llm' into fix-mutable-default-param
2025-02-06 16:22:30 +08:00
yangdx
eb9883d8da
fix: add keyword_extraction param support for LLM func of API Server
2025-02-06 15:56:18 +08:00
yangdx
eb5f57e989
fix: Fix potential mutable default parameter issue
2025-02-06 14:46:07 +08:00
yangdx
fd9b3b2658
Fix splash screen SSL line connector type.
2025-02-06 01:21:42 +08:00
yangdx
1a61d9ee7f
Fix linting
2025-02-05 22:29:07 +08:00
yangdx
f703334ce4
Split the Ollama API implementation into a separate file
2025-02-05 22:15:14 +08:00
yangdx
f77faf8023
Fix linting
2025-02-05 12:36:52 +08:00
yangdx
4663dcfbab
Merge branch 'main' into handle-stream-cancel-error
2025-02-05 12:27:05 +08:00
yangdx
f1ea7f7415
Update error response format in streaming API to a normal message so users can see what's going on
2025-02-05 11:07:31 +08:00
yangdx
24effb127d
Improve error handling and response consistency in streaming endpoints
• Add error message forwarding to client
• Handle stream cancellations gracefully
• Add logging for stream errors
• Ensure clean stream termination
• Add try-catch in OpenAI streaming
2025-02-05 10:44:48 +08:00
yangdx
ff40e61fad
Fix linting
2025-02-05 09:47:39 +08:00
yangdx
69f200faf2
feat: improve error handling for streaming responses
• Add CancelledError handling for streams
• Send error details to client in JSON
• Add error status codes and messages
• Always send final completion marker
• Refactor stream generator error handling
2025-02-05 09:46:56 +08:00
ArnoChen
eb77af8e7d
Reapply "Integrated the GraphML Visualizer as an optional component of LightRAG"
This reverts commit b2bc7770fd9d1f55dfec8e06b646bda3ecd609ea.
2025-02-05 02:33:26 +08:00
zrguo
b2bc7770fd
Revert "Integrated the GraphML Visualizer as an optional component of LightRAG"
2025-02-05 01:30:57 +08:00
Saifeddine ALOUI
9a30dc7b04
Integrated the GraphML visualizer as part of LightRAG and made it a component that can be installed using the [tools] option
2025-02-03 22:51:46 +01:00
Saifeddine ALOUI
797b5fa463
Merge branch 'HKUDS:main' into main
2025-02-03 22:05:59 +01:00
Saifeddine ALOUI
da6864d9c6
Merge branch 'HKUDS:main' into main
2025-02-03 11:24:08 +01:00
yangdx
5cf875755a
Update API endpoint documentation to clarify Ollama server compatibility
• Add Ollama server doc for /api/tags
• Update /api/generate endpoint docs
• Update /api/chat endpoint docs
2025-02-03 13:07:08 +08:00
yangdx
4ab02a878f
Fix linting
2025-02-03 12:39:52 +08:00
yangdx
a8f7b7e2b7
Add "/bypass" mode to skip context retrieval and directly use LLM
• Added SearchMode.bypass enum value
• Added /bypass prefix handler
• Skip RAG when in bypass mode
• Pass conversation history to LLM
• Apply bypass mode for both stream/non-stream
2025-02-03 11:49:17 +08:00
Saifeddine ALOUI
c65dcff991
Fixed a typo
2025-02-02 09:47:05 +01:00
yangdx
7ea1856699
Add comment to clarify LLM cache setting for entity extraction
2025-02-02 07:29:01 +08:00
yangdx
6e1b5d6ce6
Merge branch 'main' into fix-concurrent-problem
2025-02-02 04:36:52 +08:00
yangdx
ecf48a5be5
Add embedding cache config and disable LLM cache for entity extraction for API Server
2025-02-02 04:27:21 +08:00
yangdx
95edf8a51e
Fix linting
2025-02-01 15:22:40 +08:00
Saifeddine ALOUI
3a40772d30
Simplified file loading
2025-02-01 01:19:32 +01:00
Saifeddine ALOUI
e09cb85f37
Fixed linting as well as the file path
2025-02-01 01:15:06 +01:00
Saifeddine ALOUI
ef35f9a4e4
Introduced docling instead of other tools for loading files
2025-02-01 00:56:43 +01:00
zrguo
e59cb7493c
fixed linting
2025-01-31 23:35:42 +08:00
Saifeddine ALOUI
6889606a48
Update lightrag_server.py
2025-01-31 11:19:12 +01:00
Saifeddine ALOUI
381f7deec6
linting
2025-01-30 23:29:21 +01:00
Saifeddine ALOUI
219cbab1e3
Added progress when scanning files and fixed some bugs in the API
2025-01-30 23:27:43 +01:00
yangdx
5c7b2d7c9f
Merge from main
2025-01-29 22:13:01 +08:00
yangdx
7ff8c7b9d8
Add timeout parameter to OpenAI-compatible LLM model configuration
2025-01-29 21:35:46 +08:00
yangdx
7aedc08caf
Add RAG configuration options and enhance parameter configurability
• Add top-k and cosine-threshold params for API server
• Update .env and CLI params handling with new parameters
• Improve splash screen display
• Update bash and storage classes to read new parameters from .env file
2025-01-29 21:34:34 +08:00
Saifeddine ALOUI
b5d09725f5
linting
2025-01-28 18:20:45 +01:00
Saifeddine ALOUI
4ab1deaf25
Fixed --simulated-model-name argument
2025-01-28 15:32:41 +01:00
Saifeddine ALOUI
d493830b10
Update lightrag_server.py
2025-01-28 15:30:36 +01:00
Saifeddine ALOUI
9fdea743e4
Update lightrag_server.py
2025-01-28 15:03:26 +01:00
zrguo
7dd6bd4ceb
Merge branch 'main' into main
2025-01-27 23:16:06 +08:00
zrguo
ba40a8de8b
Merge pull request #659 from danielaskdd/cvs_robustness
Enhance robustness of CSV processing; fix potential CSV parsing issues
2025-01-27 23:12:43 +08:00
Saifeddine ALOUI
0721ee303c
Fixed files list
2025-01-27 12:02:22 +01:00
yangdx
c8d384f15f
Fix linting
2025-01-27 16:12:30 +08:00
yangdx
6d61b37c03
Add type ignore for pptx import
2025-01-27 16:07:03 +08:00
Saifeddine ALOUI
f307ed43f5
fixed linting
2025-01-27 02:10:24 +01:00
Saifeddine ALOUI
a68aebb124
Translated docstrings to English and enhanced the WebUI
2025-01-27 02:07:06 +01:00
zrguo
bd2b3f334e
Merge pull request #650 from danielaskdd/Add-history-support-for-ollama-api
Add history support for Ollama API
2025-01-27 06:34:10 +08:00