In this section, we provide guides and references to use the Airflow connector.
Configure and schedule Airflow metadata workflows from the OpenMetadata UI:
If you don't want to use the OpenMetadata Ingestion container to configure the workflows via the UI, then you can check the following docs to
extract metadata directly from your Airflow instance or via the CLI:
{% tilesContainer %}
{% tile
title="Ingest directly from your Airflow"
description="Configure the ingestion with a DAG on your own Airflow instance"
link="/connectors/pipeline/airflow/gcs"
/ %}
{% tile
title="Ingest with the CLI"
description="Run a one-time ingestion using the metadata CLI"
link="/connectors/pipeline/airflow/cli"
/ %}
{% /tilesContainer %}
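For reference, the DAG and CLI approaches both consume the same YAML workflow definition; with the CLI you run it via `metadata ingest -c <path-to-yaml>`. Below is a minimal sketch, assuming an Airflow webserver on `localhost:8080` and an OpenMetadata server on the default port; the service name and JWT token are placeholders.

```yaml
source:
  type: airflow
  serviceName: airflow_source          # placeholder service name
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080  # assumed Airflow webserver URL
      numberOfStatus: 10
      connection:
        type: Backend                  # valid only when running inside Airflow itself
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api  # assumed OpenMetadata server URL
    authProvider: openmetadata
    securityConfig:
      jwtToken: "<bot-jwt-token>"        # placeholder token
```

The `connection` options are described in more detail under Connection Options below.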
## Requirements
{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
{% /inlineCallout %}
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.
**Note:** OpenMetadata only supports Airflow versions that are officially supported by Apache Airflow. You can check the version list [here](https://airflow.apache.org/docs/apache-airflow/stable/installation/supported-versions.html).
## Metadata Ingestion
{% stepsContainer %}
{% step srNumber=1 %}
{% stepDescription title="1. Visit the Services Page" %}
The first step is ingesting the metadata from your sources. Under
Settings, you will find a Services page, where a Service connects an
external source system to OpenMetadata. Once a service is created, it
can be used to configure metadata, usage, and profiler workflows.
To visit the Services page, select Services from the Settings menu.
{% /stepDescription %}
{% /step %}
{% extraContent parentTagName="stepsContainer" %}
#### Connection Options
- **Host and Port**: URL to the Airflow instance.
- **Number of Status**: The number of past statuses to look back at in every ingestion run (e.g., the past executions of a DAG).
- **Connection**: Airflow metadata database connection. See these [docs](https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html)
for supported backends.
For the `connection`, we support the following selections (see the YAML sketch after this list):
- `backend`: Should not be used from the UI. This is only applicable when ingesting Airflow metadata locally
by running the ingestion from a DAG. It will use the current Airflow SQLAlchemy connection to extract the data.
- `MySQL`, `Postgres`, `MSSQL` and `SQLite`: Pass the credentials required to reach each of these services. We
will create a connection to the specified database and read the Airflow data from there.
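As a rough sketch, here is how the `connection` block might look in the ingestion YAML when reading from a MySQL metadata database; the host, credentials, and schema below are placeholders:

```yaml
connection:
  type: Mysql
  username: airflow_user      # placeholder credentials
  password: airflow_pass
  hostPort: mysql-host:3306   # placeholder host of the Airflow metadata DB
  databaseSchema: airflow_db  # schema holding the Airflow tables
```

When running the ingestion from a DAG instead, the block collapses to just the backend type and no credentials are needed, since the workflow reuses Airflow's own SQLAlchemy connection.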
{% /extraContent %}
{% step srNumber=6 %}
{% stepDescription title="6. Test the Connection" %}
Once the credentials have been added, click on `Test Connection` and save the service.
{% /stepDescription %}
{% /step %}
{% extraContent parentTagName="stepsContainer" %}
#### Metadata Ingestion Options
- **Name**: The name of the ingestion pipeline. You can customize it or keep the generated name.
- **Pipeline Filter Pattern (Optional)**: Use pipeline filter patterns to control which pipelines are included in the metadata ingestion (see the YAML sketch after this list).
    - **Include**: Explicitly include pipelines by adding a list of comma-separated regular expressions to the Include field. OpenMetadata will include all pipelines with names matching one or more of the supplied regular expressions. All other pipelines will be excluded.
    - **Exclude**: Explicitly exclude pipelines by adding a list of comma-separated regular expressions to the Exclude field. OpenMetadata will exclude all pipelines with names matching one or more of the supplied regular expressions. All other pipelines will be included.
- **Include lineage (toggle)**: Set the Include lineage toggle to control whether or not to include lineage between pipelines and data sources as part of metadata ingestion.
- **Enable Debug Log (toggle)**: Set the Enable Debug Log toggle to set the default log level to debug; these logs can be viewed later in Airflow.
- **Mark Deleted Pipelines (toggle)**: Set the Mark Deleted Pipelines toggle to flag pipelines as soft-deleted if they are not present anymore in the source system.
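These toggles and patterns map to the `sourceConfig` section of the ingestion workflow. A minimal sketch, where the `daily_.*` and `test_.*` regexes are hypothetical naming conventions:

```yaml
sourceConfig:
  config:
    type: PipelineMetadata
    includeLineage: true        # lineage between pipelines and data sources
    markDeletedPipelines: true  # soft-delete pipelines missing from the source
    pipelineFilterPattern:
      includes:
        - daily_.*              # ingest only pipelines matching these regexes
      excludes:
        - test_.*               # skip pipelines matching these regexes
```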
{% /extraContent %}
{% step srNumber=8 %}
{% stepDescription title="8. Schedule the Ingestion and Deploy" %}