Mirror of https://github.com/deepset-ai/haystack.git, synced 2026-01-06 03:57:19 +00:00
Update Docker README.md (#7369)
* Update Docker README.md
* mention 1.x/2.0

Co-authored-by: Massimiliano Pippi <mpippi@gmail.com>
parent: c3b96392fd
commit: b8fc86eb6a
@@ -1,25 +1,19 @@
 <p align="center">
-  <a href="https://www.deepset.ai/haystack/"><img src="https://raw.githubusercontent.com/deepset-ai/haystack/main/docs/img/haystack_logo_colored.png" alt="Haystack"></a>
+  <a href="https://haystack.deepset.ai/"><img src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" alt="Haystack by deepset"></a>
 </p>
 
-Haystack is an end-to-end framework that enables you to build powerful and production-ready
-pipelines for different search use cases. The Docker image comes with a web service
-configured to serve Haystack's `rest_api` to ease pipeline deployments in containerized
-environments.
+[Haystack](https://github.com/deepset-ai/haystack) is an end-to-end LLM framework that allows you to build applications powered by LLMs, Transformer models, vector search and more. Whether you want to perform retrieval-augmented generation (RAG), document search, question answering or answer generation, Haystack can orchestrate state-of-the-art embedding models and LLMs into pipelines to build end-to-end NLP applications and solve your use case.
 
-To start the Docker container binding the TCP port `8000` locally, run:
-```sh
-docker run -p 8000:8000 deepset/haystack
-```
+## Haystack 2.0
 
-If you need the container to access other services available in the host, run:
-```sh
-docker run -p 8000:8000 --network="host" deepset/haystack
-```
+For the latest version of Haystack there's only one image available:
 
-## Image Variants
+- `haystack:base-<version>` contains a working Python environment with Haystack preinstalled. This image is expected to
+be derived `FROM`.
 
-The Docker image comes in six variants:
+## Haystack 1.x image variants
+
+The Docker image for Haystack 1.x comes in six variants:
 - `haystack:gpu-<version>` contains Haystack dependencies as well as what's needed to run the REST API and UI. It comes with the CUDA runtime and is capable of running on GPUs.
 - `haystack:cpu-remote-inference-<version>` is a slimmed down version of the CPU image with the REST API and UI. It is specifically designed for PromptNode inferencing using remotely hosted models, such as Hugging Face Inference, OpenAI, Cohere, Anthropic, and similar.
 - `haystack:cpu-<version>` contains Haystack dependencies as well as what's needed to run the REST API and UI. It has no support for GPU so must be run on CPU.
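The updated README states that the Haystack 2.0 `base` image is expected to be derived `FROM` rather than run directly. A minimal sketch of such a derived image might look like the following; the version tag placeholder, file paths, and the `app.py` entry-point script are illustrative assumptions, not part of this commit:

```dockerfile
# Hypothetical Dockerfile building on the Haystack 2.0 base image.
# Replace <version> with a real released tag, as in the README's
# `haystack:base-<version>` naming scheme.
FROM deepset/haystack:base-<version>

# Copy in your own pipeline code (illustrative file name).
COPY app.py /opt/app/app.py

# Run it with the Python environment the base image provides.
CMD ["python", "/opt/app/app.py"]
```

Since the 2.0 base image no longer ships the 1.x REST API service, any web endpoint or serving layer would be supplied by the derived image itself.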