Update README.rst

Tanay Soni 2020-06-08 16:42:31 +02:00 committed by GitHub
parent 97bbb4280c
commit ebb9e99b6c

@@ -133,9 +133,10 @@ ElasticsearchRetriever
 Scoring text similarity via sparse Bag-of-words representations is a strong and well-established baseline in Information Retrieval.
 The default :code:`ElasticsearchRetriever` uses Elasticsearch's native scoring (BM25), but can be extended easily with custom queries or filtering.
-Example::
+Example
+.. code-block:: python
     retriever = ElasticsearchRetriever(document_store=document_store, custom_query=None)
     retriever.retrieve(query="Why did the revenue increase?", filters={"years": ["2019"], "company": ["Q1", "Q2"]})
     # returns: [Document, Document]
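The BM25 scoring that :code:`ElasticsearchRetriever` relies on can be illustrated with a minimal, self-contained sketch in plain Python (no Elasticsearch required; the toy corpus and the default parameter values are illustrative assumptions, not the library's implementation):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.2, b=0.75):
    """Score each tokenized doc against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # document frequency: in how many docs does each term occur?
    df = Counter()
    for d in docs:
        for term in set(d):
            df[term] += 1
    scores = []
    for d in docs:
        tf = Counter(d)
        dl = len(d)
        s = 0.0
        for term in query:
            if term not in tf:
                continue
            idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
            # term frequency saturates via k1; b controls length normalization
            s += idf * tf[term] * (k1 + 1) / (tf[term] + k1 * (1 - b + b * dl / avgdl))
        scores.append(s)
    return scores

docs = [
    "revenue increased due to strong sales".split(),
    "the weather was cold last winter".split(),
]
print(bm25_scores("why did revenue increase".split(), docs))
```

The first document shares the term "revenue" with the query and receives a positive score; the second shares nothing and scores zero.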
@@ -145,9 +146,10 @@ EmbeddingRetriever
 Using dense embeddings (i.e. vector representations) of texts is a powerful alternative for scoring text similarity.
 This retriever allows you to transform your query into an embedding using a model (e.g. Sentence-BERT) and find similar texts by using cosine similarity.
-Example::
+Example
+.. code-block:: python
     retriever = EmbeddingRetriever(document_store=document_store,
                                    embedding_model="deepset/sentence-bert",
                                    model_format="farm")
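The cosine-similarity comparison that an embedding retriever performs over dense vectors can be sketched in plain Python (the 3-dimensional toy vectors stand in for real sentence embeddings, which typically have hundreds of dimensions):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

query_emb = [0.9, 0.1, 0.3]          # stand-in for an embedded query
doc_embs = [
    [0.8, 0.2, 0.4],                 # points in a similar direction
    [-0.5, 0.9, -0.1],               # points elsewhere
]
# rank document indices by similarity to the query, best first
ranked = sorted(range(len(doc_embs)),
                key=lambda i: cosine_similarity(query_emb, doc_embs[i]),
                reverse=True)
print(ranked)  # [0, 1]
```

Because cosine similarity only compares directions, it is insensitive to the overall magnitude of the embedding vectors.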
@@ -168,9 +170,10 @@ Both readers can load either a local model or any public model from `Hugging Fa
 FARMReader
 ^^^^^^^^^^
 Implementing various QA models via the `FARM <https://github.com/deepset-ai/FARM>`_ Framework.
-Example::
+Example
+.. code-block:: python
     reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2",
                         use_gpu=False, no_ans_boost=-10, context_window_size=500,
                         top_k_per_candidate=3, top_k_per_sample=1,
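The span-selection step that parameters like :code:`no_ans_boost` influence can be illustrated with a stand-alone sketch. This is a deliberately simplified model of extractive QA decoding, not FARM's actual implementation: the logits are made up, and treating position 0 as the "no answer" slot is an assumption for illustration.

```python
def best_span(start_logits, end_logits, no_ans_boost=-10, max_answer_len=15):
    """Pick the (start, end) token span maximizing start+end logit.

    Returns the best span, or None if the no-answer score
    (boosted by `no_ans_boost`) beats every answer span.
    """
    best, best_score = None, float("-inf")
    for s, s_logit in enumerate(start_logits):
        # only consider spans of bounded length that end after they start
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    # by convention here, position 0 stands for "no answer"
    no_answer_score = start_logits[0] + end_logits[0] + no_ans_boost
    if no_answer_score > best_score:
        return None
    return best

start = [0.1, 0.2, 5.0, 0.3]
end = [0.1, 0.2, 0.3, 4.0]
print(best_span(start, end))  # (2, 3)
```

A strongly negative :code:`no_ans_boost` (the default shown in the example above) makes the reader reluctant to return "no answer"; a large positive value has the opposite effect.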
@@ -194,9 +197,10 @@ TransformersReader
 ^^^^^^^^^^^^^^^^^^
 Implementing various QA models via the :code:`pipeline` class of the `Transformers <https://github.com/huggingface/transformers>`_ framework.
-Example::
+Example
+.. code-block:: python
     reader = TransformersReader(model="distilbert-base-uncased-distilled-squad",
                                 tokenizer="distilbert-base-uncased",
                                 context_window_size=500,
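The :code:`context_window_size` argument both readers accept controls how much surrounding text is returned alongside each extracted answer. A minimal sketch of such a windowing step (an illustrative helper, not the library's actual code):

```python
def context_window(text, answer_start, answer_end, window_size=500):
    """Return a slice of at most `window_size` chars centered on the answer."""
    answer_len = answer_end - answer_start
    # split the remaining budget evenly before and after the answer
    margin = max((window_size - answer_len) // 2, 0)
    start = max(answer_start - margin, 0)
    end = min(answer_end + margin, len(text))
    return text[start:end]

doc = "The revenue increased by 20 percent in 2019 thanks to strong sales."
ans_start = doc.index("20 percent")
window = context_window(doc, ans_start, ans_start + len("20 percent"), window_size=30)
print(window)
```

Clamping the slice to the document boundaries means answers near the start or end of a text simply get a shorter, one-sided window.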