* Move flowsettings.py and launch.py to the repository root
* Update docs
* Sync sub-package versions
* Rename launch.py to app.py and make the run scripts work with the installed package
* Add update scripts
* Auto-version the root package
* Rename authors and update the docs directory
* Update auto-bump-and-release.yaml to trigger on push to the main branch
* Use latest as a branch instead of a tag
* Pin dependency versions
* Cache the changelogs
* Provide the Embedding management UI
* Update Fastembed documentation
* Add validation when adding or updating embeddings
* Stop using the old ktem embeddings manager
* Set default local embedding models (see the sketch below)
* Move the local embeddings further down in flowsettings
* Update flowsettings
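
As a rough illustration of the default local embedding entry mentioned above, the snippet below sketches what such a setting in flowsettings.py could look like. The `KH_EMBEDDINGS` name, the `__type__` spec convention, and the model id are assumptions for illustration, not the exact settings used.

```python
# Hypothetical sketch of a default local embedding entry in flowsettings.py.
# The KH_EMBEDDINGS name, the "__type__" spec convention, and the model id
# are illustrative assumptions, not the project's exact configuration.
KH_EMBEDDINGS = {}

KH_EMBEDDINGS["local"] = {
    "spec": {
        "__type__": "kotaemon.embeddings.FastEmbedEmbeddings",  # assumed class path
        "model_name": "BAAI/bge-base-en-v1.5",                  # assumed default model
    },
    "default": True,  # mark this entry as the default local embedding model
}
```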
* Rename AzureChatOpenAI to LCAzureChatOpenAI
* Provide vanilla ChatOpenAI and AzureChatOpenAI
* Remove the "highest accuracy" and "lowest cost" criteria
  These criteria are unnecessary: the users, not the pipeline creators, should choose
  which LLM to use. Furthermore, entering this information is cumbersome and
  degrades the user experience.
* Remove the LLM selection in simple reasoning pipeline
* Provide a dedicated stream method to generate the output (see the sketch below)
* Return a placeholder message to the chat if the text is empty
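
The streaming and empty-output changes above can be pictured with a small sketch. The class, method names, and placeholder wording below are illustrative stand-ins, not the project's actual pipeline API.

```python
from typing import Iterator


class SimplePipeline:
    """Illustrative stand-in for the reasoning pipeline (not the real class)."""

    def run(self, prompt: str) -> str:
        # Non-streaming path: collect all chunks, then fall back to a
        # placeholder so the chat never receives an empty message.
        text = "".join(self.stream(prompt))
        return text or "(no answer was generated)"  # placeholder wording is illustrative

    def stream(self, prompt: str) -> Iterator[str]:
        # Dedicated streaming path: yield chunks as they are produced so the
        # UI can render partial output immediately.
        for chunk in self._call_llm(prompt):
            yield chunk

    def _call_llm(self, prompt: str) -> Iterator[str]:
        # Stand-in for the real LLM call; replace with the project's LLM component.
        yield from (prompt[i : i + 8] for i in range(0, len(prompt), 8))
```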
* Include static files in the package
* More reliable information panel: faster and no longer breaks randomly
* Add directory upload
* Enable zip file upload
* Allow setting the OCR reader endpoint via an environment variable
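
For the OCR reader endpoint, a minimal sketch of the environment-variable override is shown below. The variable name and default URL are assumptions for illustration, not the project's actual values.

```python
import os

# Hypothetical sketch: let an environment variable override the OCR reader
# endpoint, falling back to a local default. The variable name and default
# URL are assumptions, not the project's actual values.
DEFAULT_OCR_ENDPOINT = "http://localhost:8000/v2/ai/infer/"


def get_ocr_endpoint() -> str:
    return os.environ.get("OCR_READER_ENDPOINT", DEFAULT_OCR_ENDPOINT)
```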