| **Mandatory init variables** | "api_type": The type of Hugging Face API to use <br /> <br />"api_params": A dictionary with one of the following keys: <br /> <br />- `model`: Hugging Face model ID. Required when `api_type` is `SERVERLESS_INFERENCE_API` <br /> <br />**OR** <br /> <br />- `url`: URL of the inference endpoint. Required when `api_type` is `INFERENCE_ENDPOINTS` or `TEXT_EMBEDDINGS_INFERENCE` <br /> <br />"token": The Hugging Face API token. Can be set with the `HF_API_TOKEN` or `HF_TOKEN` env var. |
| **Mandatory run variables** | "documents": A list of documents to be embedded |
| **Output variables** | "documents": A list of documents enriched with embeddings |
The component uses the `HF_API_TOKEN` environment variable by default. Otherwise, you can pass a Hugging Face API token at initialization with `token` (see the code examples below).
For more fine-grained details, refer to the component’s [API reference](/reference/embedders-api#huggingfaceapidocumentembedder).
### On its own
#### Using Free Serverless Inference API
Formerly known as the (free) Hugging Face Inference API, this API lets you quickly experiment with many models hosted on the Hugging Face Hub, offloading inference to Hugging Face servers. It is rate-limited and not meant for production.
#### Using Self-Hosted Text Embeddings Inference (TEI)
[Hugging Face Text Embeddings Inference](https://github.com/huggingface/text-embeddings-inference) is a toolkit for efficiently deploying and serving text embedding models.
While it powers the most recent versions of the Serverless Inference API and Inference Endpoints, it can also easily be run on-premise through Docker.
For example, you can run a TEI container as follows:
```shell
model=BAAI/bge-large-en-v1.5
revision=refs/pr/5
volume=$PWD/data # share a volume with the Docker container to avoid downloading weights every run
docker run --gpus all -p 8080:80 -v $volume:/data --pull always ghcr.io/huggingface/text-embeddings-inference:1.2 --model-id $model --revision $revision
```
For more information, refer to the [official TEI repository](https://github.com/huggingface/text-embeddings-inference).