* Make InMemoryDocumentStore accept and apply filters in delete_documents()
* Modify test_document_store.py to test the filtered deletion in memory, sql and milvus too
* Make FAISSDocumentStore accept and properly apply filters in delete_documents()
* Add latest docstring and tutorial changes
* Remove accidentally duplicated test
* Remove unnecessary decorators from test/test_document_store.py::test_delete_documents_with_filters
* Add embeddings count test for FAISS and Milvus; Milvus fails it.
* Fix a bug that prevented Milvus from deleting embeddings
* Remove batch-size parametrization in tests & update all document stores' docstrings with a filter example
* Add latest docstring and tutorial changes
Co-authored-by: prafgup <prafulgupta6@gmail.com>
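The filter-aware deletion introduced above can be sketched with a toy store (illustrative only: `ToyDocumentStore` is a hypothetical stand-in, not the real class; it assumes Haystack-style filters of the form `{"field": ["allowed", "values"]}`):

```python
class ToyDocumentStore:
    """Hypothetical minimal store illustrating filtered delete_documents()."""

    def __init__(self):
        self.documents = []  # each document is a dict with a "meta" dict

    def write_documents(self, docs):
        self.documents.extend(docs)

    def delete_documents(self, filters=None):
        # No filters: delete everything (the old delete_all_documents behavior)
        if not filters:
            self.documents = []
            return

        def matches(doc):
            # A document matches when every filtered field holds an allowed value
            return all(doc["meta"].get(field) in values
                       for field, values in filters.items())

        self.documents = [d for d in self.documents if not matches(d)]


store = ToyDocumentStore()
store.write_documents([
    {"content": "doc 1", "meta": {"year": "2020"}},
    {"content": "doc 2", "meta": {"year": "2021"}},
])
store.delete_documents(filters={"year": ["2020"]})
print(len(store.documents))  # → 1
```

The same call shape (`delete_documents(filters={...})`) is what the tests in test_document_store.py exercise against the in-memory, SQL, FAISS, and Milvus backends.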
* Save the FAISSDocumentStore init params to JSON in save() and load them back in load() if found. First draft, to be tested.
* Fix an issue with string/Path objects in a few string operations, caught by mypy
* Leverage self.set_config instead of saving the parameters in a separate attribute
* Modify test_faiss_and_milvus:test_faiss_index_save_and_load to test that init params are preserved
* Add an assert to verify that the SQL doc count and the FAISS vector count are equal. The name of the SQL db must always be specified for this to work
* Simplify the implementation a bit and add better comments
* Add a missing return at the end of the file
* Address some of the suggestions from the review
* Add a try-catch in the load method and fix the tests
* Typo
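The save/load round-trip of init params can be sketched as follows (a simplified stand-alone illustration: `ConfigSavingStore` and its parameters are hypothetical; the real FAISSDocumentStore persists its config next to the FAISS index file and falls back gracefully when no config is found):

```python
import json
import tempfile
from pathlib import Path


class ConfigSavingStore:
    """Hypothetical store that persists its init params as JSON at save()."""

    def __init__(self, similarity="dot_product", vector_dim=768):
        self.config = {"similarity": similarity, "vector_dim": vector_dim}

    def save(self, index_path):
        index_path = Path(index_path)  # accept str or Path alike
        # ... the real store would also write the FAISS index itself here ...
        config_path = index_path.with_suffix(".json")
        config_path.write_text(json.dumps(self.config))

    @classmethod
    def load(cls, index_path):
        config_path = Path(index_path).with_suffix(".json")
        try:
            # Restore the saved init params if the JSON file exists
            params = json.loads(config_path.read_text())
        except FileNotFoundError:
            params = {}  # fall back to defaults, as the try-catch commit suggests
        return cls(**params)


with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "my_index.faiss"
    store = ConfigSavingStore(similarity="cosine", vector_dim=128)
    store.save(path)
    restored = ConfigSavingStore.load(path)
    print(restored.config)  # → {'similarity': 'cosine', 'vector_dim': 128}
```

Casting to `Path` up front is what resolves the string/Path mixing that mypy flagged.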
* feat: normalize embeddings for cosine sim
* WIP add test case for faiss cosine
* input to faiss normalize needs to be an array of vectors
* fix: test should compare correct result embedding to original embedding
* add sanity check for cosine sim
* fix typo
* normalize cosine score
* Update docstring
Co-authored-by: Malte Pietsch <malte.pietsch@deepset.ai>
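The cosine-similarity commits rely on a standard trick: if every embedding is L2-normalized before indexing (what `faiss.normalize_L2` does, in place, on an array of vectors), the inner product of two vectors equals their cosine similarity. A minimal pure-Python sketch, with the `(score + 1) / 2` rescaling as one common way to map cosine scores into [0, 1]:

```python
import math


def normalize(vector):
    """Scale a vector to unit L2 norm, mimicking faiss.normalize_L2."""
    norm = math.sqrt(sum(x * x for x in vector))
    return [x / norm for x in vector]


def inner_product(a, b):
    return sum(x * y for x, y in zip(a, b))


a = normalize([3.0, 4.0])
b = normalize([4.0, 3.0])

raw_score = inner_product(a, b)  # equals cos(angle between a and b)
scaled = (raw_score + 1) / 2     # map [-1, 1] into [0, 1]
print(round(raw_score, 4))       # → 0.96
print(round(scaled, 4))          # → 0.98
```

The sanity check from the commits above falls out directly: a normalized vector's inner product with itself is 1.0, the maximum cosine score.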
* [UPDT] delete_all_documents() replaced by delete_documents()
* [UPDT] warning logs to be fixed
* [UPDT] delete_all_documents() renamed; a method with the same name re-added as an alias
Co-authored-by: Ram Garg <ramgarg102@gmai.com>
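The rename with a backward-compatible alias can be sketched like this (hypothetical minimal store; the warning text and return value are illustrative, not the real implementation):

```python
import warnings


class MinimalDocumentStore:
    """Hypothetical store keeping delete_all_documents() as a deprecated alias."""

    def delete_documents(self, filters=None):
        # New canonical deletion API
        return "deleted"

    def delete_all_documents(self, filters=None):
        # Same behavior kept under the old name, but with a warning
        warnings.warn(
            "delete_all_documents() is deprecated; use delete_documents()",
            DeprecationWarning,
        )
        return self.delete_documents(filters=filters)


store = MinimalDocumentStore()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = store.delete_all_documents()

print(result)                       # → deleted
print(caught[0].category.__name__)  # → DeprecationWarning
```

Delegating the alias to the new method keeps the two code paths identical, so fixing a bug in `delete_documents()` automatically fixes the deprecated entry point too.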