
Generator

Example

See Tutorial 7 for a guide on how to build your own generative QA system.

While extractive QA highlights the span of text that answers a query, generative QA returns a novel text answer that it has composed. The best current approaches, such as Retrieval-Augmented Generation (RAG), can draw on both the knowledge gained during language model pretraining (parametric memory) and the passages supplied by a retriever (non-parametric memory). With the advent of Transformer-based retrieval methods such as Dense Passage Retrieval, the retriever and generator can be trained concurrently from a single loss signal.
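To make the two memory sources concrete: the generator's weights hold parametric knowledge, while the retrieved passages act as non-parametric memory that is spliced into the model's input at query time. A minimal sketch of that conditioning step (the `build_prompt` helper is illustrative only and not part of Haystack's API):

```python
def build_prompt(query: str, passages: list[str]) -> str:
    """Combine retrieved passages (non-parametric memory) with the query
    so a seq2seq generator can compose a novel answer from both sources."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt(
    "Who wrote Faust?",
    [
        "Faust is a tragic play by Johann Wolfgang von Goethe.",
        "Goethe published Faust, Part One in 1808.",
    ],
)
# A seq2seq generator (e.g. a BART-style model, as used in RAG) would now
# decode an answer token by token, drawing on this input and on the
# knowledge stored in its own weights.
```

In a full RAG setup this concatenation happens inside the model for each retrieved document, and the generator marginalizes over documents rather than reading one flat prompt, but the division of labor between the two memory types is the same.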

Pros

  • More appropriately phrased answers
  • Able to synthesize information from different texts
  • Can draw on latent knowledge stored in language model

Cons

  • Hard to track which pieces of information the generator based its response on
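A common mitigation for this traceability problem is to return the retrieved documents alongside the generated answer, so a user can at least inspect the candidate sources. A minimal sketch of such a result object (the class and field names are illustrative, not Haystack's API):

```python
from dataclasses import dataclass, field


@dataclass
class GeneratedAnswer:
    """A generated answer bundled with the IDs of the passages the
    generator was conditioned on, for rough provenance tracking."""
    answer: str
    source_ids: list[str] = field(default_factory=list)


result = GeneratedAnswer(
    answer="Johann Wolfgang von Goethe",
    source_ids=["doc_12", "doc_47"],  # retrieved passages fed to the generator
)
```

Note that this only narrows the answer down to the retrieved set; unlike extractive QA, it cannot point to an exact supporting span, and the answer may still come from parametric memory alone.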