If you already have an Airflow instance up and running, you might want to reuse it to host the metadata workflows as
well. Here we will guide you through the different aspects to consider when configuring an existing Airflow instance.
There are three different angles here:
1. Installing the ingestion modules directly on the host to enable the [Airflow Lineage Backend](/openmetadata/connectors/pipeline/airflow/lineage-backend).
2. Installing connector modules on the host to run specific workflows.
3. Installing the Airflow APIs to enable the workflow deployment through the UI.
Depending on what you wish to use, you might only need some of these installations. Note that the installation
commands shown below need to be run on the Airflow instances.
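As a hedged sketch of what these installations look like (the package names and extras here are assumptions and may differ across OpenMetadata versions; the linked pages below have the authoritative commands):
```bash
# Illustrative only: run inside the Airflow environment (e.g., the scheduler's virtualenv).
# Package names and extras are assumptions - verify them for your OpenMetadata version.
pip install "openmetadata-ingestion"          # 1. ingestion modules used by the Lineage Backend
pip install "openmetadata-ingestion[mysql]"   # 2. a connector module for a specific source
pip install "openmetadata-managed-apis"       # 3. the APIs module for UI-driven deployments
```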
### Airflow Lineage Backend
Goals:
- Ingest DAGs and Tasks as Pipeline Entities when they run.
- Track DAG and Task status.
- Document lineage as code directly on the DAG definition and ingest it when the DAGs run.
You can find the information needed to install the Lineage Backend and extract metadata with it [here](/openmetadata/connectors/pipeline/airflow/lineage-backend).
### Connector Modules
Goal:
- Ingest metadata from specific sources.
The current approach we follow here is to prepare the metadata ingestion DAGs as `PythonOperator`s. This means that
the ingestion packages need to be installed on the Airflow instances.
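For illustration, here is a minimal sketch of that pattern. The workflow import path (`metadata.workflow.metadata.MetadataWorkflow`) and the YAML layout are assumptions that vary across `openmetadata-ingestion` releases, so treat this as the shape of the DAG rather than a drop-in example:
```python
from datetime import datetime

import yaml
from airflow import DAG
from airflow.operators.python import PythonOperator

# Assumption: the import path depends on your openmetadata-ingestion version
# (older releases expose a Workflow class under metadata.ingestion.api.workflow).
from metadata.workflow.metadata import MetadataWorkflow

# Hypothetical connector configuration; the real YAML depends on the source you ingest.
CONFIG = """
source:
  type: mysql
  serviceName: mysql_example
  serviceConnection:
    config:
      type: Mysql
      username: openmetadata_user
      password: openmetadata_password
      hostPort: localhost:3306
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "<jwt-token>"
"""


def metadata_ingestion_workflow() -> None:
    """Build the ingestion workflow from the YAML configuration and run it."""
    workflow_config = yaml.safe_load(CONFIG)
    workflow = MetadataWorkflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    dag_id="mysql_metadata_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="ingest_metadata",
        python_callable=metadata_ingestion_workflow,
    )
```
The YAML wrapped by the `PythonOperator` is the same workflow configuration described in each connector's documentation; only the surrounding DAG changes.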
### Airflow APIs
Goal:
- Deploy metadata ingestion workflows directly from the UI.
After installing the Airflow APIs, you will need to update your OpenMetadata server. The OpenMetadata server takes all of its configuration from a YAML file, which you can find in our [repo](https://github.com/open-metadata/OpenMetadata/tree/main/conf). In
`openmetadata.yaml`, update the `airflowConfiguration` section accordingly.
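As a rough sketch only (field names change between OpenMetadata releases, so compare against the `openmetadata.yaml` shipped in the repo linked above), the section looks something like this:
```yaml
# Assumed field names - verify against the openmetadata.yaml of your release.
airflowConfiguration:
  apiEndpoint: http://localhost:8080               # where the OpenMetadata server reaches Airflow
  username: admin                                  # Airflow user allowed to deploy and trigger DAGs
  password: admin
  metadataApiEndpoint: http://localhost:8585/api   # endpoint Airflow uses to call back into OpenMetadata
```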