* fix: warm up individual tools inside Toolsets in warm_up_tools()

Related Issues:
* Follows up on PR #9942 (feat: Add warm_up() method to ChatGenerators)
* Addresses a bug discovered during implementation of PR #9942 for issue #9907

Proposed Changes:
The warm_up_tools() utility function was only calling warm_up() on Toolset objects themselves, not on the individual Tool instances contained within them. As a result, tools inside a Toolset were not properly initialized before use. This PR modifies warm_up_tools() to iterate through Toolsets and call warm_up() on each individual tool, in addition to calling warm_up() on the Toolset itself.

Changes:
- Modified warm_up_tools() in haystack/tools/utils.py to iterate through Toolsets when encountered (both as a single argument and within lists)
- Added iteration to call warm_up() on each individual Tool inside Toolsets
- Added comprehensive test class TestWarmUpTools with 7 test cases

How did you test it:
- Added 7 unit tests in test/tools/test_tools_utils.py:
  * test_warm_up_tools_with_none - handles None input
  * test_warm_up_tools_with_single_tool - single tool in a list
  * test_warm_up_tools_with_single_toolset - KEY TEST: verifies both the Toolset and its individual tools are warmed
  * test_warm_up_tools_with_list_containing_toolset - Toolset within a list
  * test_warm_up_tools_with_multiple_toolsets - multiple Toolsets
  * test_warm_up_tools_with_mixed_tools_and_toolsets - mixed scenarios
  * test_warm_up_tools_idempotency - safe to call multiple times

Notes for the reviewer:
I discovered this bug while implementing PR #9942 (for issue #9907). When a Toolset object is passed to a component's tools parameter, warm_up_tools() only calls Toolset.warm_up(), which is a no-op; it does not iterate through the individual tools inside the Toolset to warm them up.
This was acknowledged by @vblagoje and @sjrl. This implementation:
- Modified warm_up_tools() to iterate through Toolsets and call warm_up() on each individual tool
- Added comprehensive tests for Toolset warming behavior
- Verified that both the Toolset and its contained tools are warmed up

Checklist:
- I have read the contributors guidelines and the code of conduct
- I have updated the related issue with new insights and changes
- I added unit tests and updated the docstrings
- I've used one of the conventional commit types for my PR title: fix:
- I documented my code
- I ran pre-commit hooks and fixed any issue

* added release note

* refactor: move tool warm-up iteration to Toolset.warm_up()

Addresses architectural feedback: moved the iteration logic from warm_up_tools() to the base Toolset.warm_up() for better encapsulation. Subclasses can now override warm_up() to customize initialization without breaking the contract.
- Toolset.warm_up() now iterates over and warms its tools by default
- warm_up_tools() simplified to delegate to warm_up()
- Updated tests and release notes

---------

Co-authored-by: HamidOna13 <abdulhamid.onawole@aizatron.com>
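The refactored design described above can be sketched with stand-in classes. This is not the actual Haystack source; all class and attribute names besides warm_up() and warm_up_tools() are hypothetical placeholders that mirror the behavior the PR describes.

```python
from typing import List, Optional, Union

class Tool:
    """Stand-in for haystack.tools.Tool (hypothetical minimal version)."""
    def __init__(self, name: str):
        self.name = name
        self.warmed_up = False  # hypothetical flag, for illustration only

    def warm_up(self) -> None:
        # Expensive one-time setup (loading a model, opening a
        # connection) would happen here; calling it twice is harmless.
        self.warmed_up = True

class Toolset:
    """Stand-in for haystack.tools.Toolset after the refactor."""
    def __init__(self, tools: List[Tool]):
        self.tools = tools

    def warm_up(self) -> None:
        # Post-refactor contract: the base class warms each contained
        # tool, so callers no longer need to reach inside the Toolset.
        for tool in self.tools:
            tool.warm_up()

def warm_up_tools(tools: Optional[Union[Toolset, List[Union[Tool, Toolset]]]]) -> None:
    """Simplified sketch of the utility after the refactor: it only delegates."""
    if tools is None:
        return
    if isinstance(tools, Toolset):
        tools.warm_up()
        return
    for item in tools:  # a list may mix Tools and Toolsets
        item.warm_up()

toolset = Toolset([Tool("search"), Tool("calculator")])
warm_up_tools(toolset)
print(all(t.warmed_up for t in toolset.tools))  # → True
```

The design choice here is encapsulation: because the iteration lives in Toolset.warm_up(), a subclass can override that one method to customize initialization, and warm_up_tools() keeps working unchanged.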
Haystack is an end-to-end LLM framework that allows you to build applications powered by LLMs, Transformer models, vector search and more. Whether you want to perform retrieval-augmented generation (RAG), document search, question answering, or answer generation, Haystack can orchestrate state-of-the-art embedding models and LLMs into pipelines to build end-to-end NLP applications and solve your use case.
Table of Contents
- Installation
- Documentation
- Features
- Use Cases
- Hayhooks (REST API Deployment)
- Haystack Enterprise
- deepset Studio
- Telemetry
- 🖖 Community
- Contributing to Haystack
- Who Uses Haystack
Installation
The simplest way to get Haystack is via pip:
pip install haystack-ai
Install from the main branch to try the newest features:
pip install git+https://github.com/deepset-ai/haystack.git@main
Haystack supports multiple installation methods including Docker images. For a comprehensive guide please refer to the documentation.
Documentation
If you're new to the project, check out "What is Haystack?" then go through the "Get Started Guide" and build your first LLM application in a matter of minutes. Keep learning with the tutorials. For more advanced use cases, or just to get some inspiration, you can browse our Haystack recipes in the Cookbook.
At any given point, hit the documentation to learn more about Haystack, what it can do for you, and the technology behind it.
Features
- Technology agnostic: Allow users the flexibility to decide what vendor or technology they want and make it easy to switch out any component for another. Haystack allows you to use and compare models available from OpenAI, Cohere and Hugging Face, as well as your own local models or models hosted on Azure, Bedrock and SageMaker.
- Explicit: Make it transparent how different moving parts can “talk” to each other so it's easier to fit your tech stack and use case.
- Flexible: Haystack provides all tooling in one place: database access, file conversion, cleaning, splitting, training, eval, inference, and more. And whenever custom behavior is desirable, it's easy to create custom components.
- Extensible: Provide a uniform and easy way for the community and third parties to build their own components and foster an open ecosystem around Haystack.
Some examples of what you can do with Haystack:
- Build retrieval-augmented generation (RAG) using one of the available vector databases and customizing your LLM interaction; the sky is the limit 🚀
- Perform Question Answering in natural language to find granular answers in your documents.
- Perform semantic search and retrieve documents according to meaning.
- Build applications that can make complex decisions to answer complex queries: for example, systems that can resolve intricate customer requests or search knowledge across many disconnected resources.
- Scale to millions of docs using retrievers and production-scale components.
- Use off-the-shelf models or fine-tune them to your data.
- Use user feedback to evaluate, benchmark, and continuously improve your models.
Tip
Would you like to deploy and serve Haystack pipelines as REST APIs yourself? Hayhooks provides a simple way to wrap your pipelines with custom logic and expose them via HTTP endpoints, including OpenAI-compatible chat completion endpoints and compatibility with fully-featured chat interfaces like open-webui.
Haystack Enterprise: Best Practices and Expert Support
Get expert support from the Haystack team, build faster with enterprise-grade templates, and scale securely with deployment guides for cloud and on-prem environments - all with Haystack Enterprise. Read more about it in our announcement post.
deepset Studio: Your Development Environment for Haystack
Use deepset Studio to visually create, deploy, and test your Haystack pipelines. Learn more about it in our announcement post.
👉 Sign up!
Tip
Are you looking for a managed solution built on Haystack? deepset AI Platform is our fully managed, end-to-end platform for integrating LLMs with your data, and it uses Haystack for its LLM pipeline architecture.
Telemetry
Haystack collects anonymous usage statistics of pipeline components. We receive an event every time these components are initialized. This way, we know which components are most relevant to our community.
Read more about telemetry in Haystack or how you can opt out in Haystack docs.
🖖 Community
If you have a feature request or a bug report, feel free to open an issue on GitHub. We regularly check these and you can expect a quick response. If you'd like to discuss a topic, or get more general advice on how to make Haystack work for your project, you can start a thread in GitHub Discussions or our Discord channel. We also check 𝕏 (Twitter) and Stack Overflow.
Contributing to Haystack
We are very open to the community's contributions - be it a quick fix of a typo, or a completely new feature! You don't need to be a Haystack expert to provide meaningful improvements. To learn how to get started, check out our Contributor Guidelines first.
There are several ways you can contribute to Haystack:
- Contribute to the main Haystack project
- Contribute an integration on haystack-core-integrations
Tip
👉 Check out the full list of issues that are open to contributions
Who Uses Haystack
Here's a list of projects and companies using Haystack. Are you also using Haystack? Open a PR or tell us your story.
- Tech & AI Innovators: Apple, Meta, Databricks, NVIDIA, PostHog
- Public Sector: German Federal Ministry of Research, Technology, and Space (BMFTR), PD, Baden-Württemberg State
- Enterprise & Telecom: Alcatel-Lucent, Intel, NOS Portugal, TELUS Agriculture & Consumer Goods
- Aerospace & Hardware: Airbus, Infineon, LEGO
- Media & Entertainment: Netflix, Comcast, Zeit Online, Rakuten
- Legal & Publishing: Manz, Oxford University Press
- Startups & Research: YPulse, BetterUp, Intel Labs

