---
title: Run the Airflow Connector Externally
slug: /connectors/pipeline/airflow/yaml
---

{% connectorDetailsHeader name="Airflow" stage="PROD" platform="OpenMetadata" availableFeatures=["Pipelines", "Pipeline Status", "Lineage", "Owners"] unavailableFeatures=["Tags"] / %}

In this section, we provide guides and references to use the Airflow connector.

Configure and schedule Airflow metadata and profiler workflows from the OpenMetadata UI:

{% partial file="/v1.8/connectors/external-ingestion-deployment.md" /%}

## Requirements

### Python Requirements

{% partial file="/v1.8/connectors/python-requirements.md" /%}

To run the Airflow ingestion, you will need to install:

```bash
pip3 install "openmetadata-ingestion[airflow]"
```

{% note %}

Note that this installs the same Airflow version that we ship in the Ingestion Container. If you are already running the ingestion from Airflow, you DO NOT need to install the Airflow plugin.

Instead, just run `pip3 install "openmetadata-ingestion"`.

{% /note %}

Note: we only support Airflow versions that are officially supported by the Airflow project. You can check the version list here.

## Metadata Ingestion

All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to Airflow.

To create and run a Metadata Ingestion workflow, we will follow these steps to create a YAML configuration that can connect to the source, process the Entities if needed, and reach the OpenMetadata server.

The workflow is modeled around the following JSON Schema.
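To give a sense of the overall shape before the detailed sample below, here is a minimal sketch of such a YAML file. The values are placeholders, and it assumes a local OpenMetadata server at http://localhost:8585/api with JWT authentication; the fully annotated sample in the next step is the reference, and the sourceConfig, sink, and workflowConfig sections are documented in the partials it links to.

```yaml
# Minimal sketch of the workflow YAML structure (placeholder values only).
source:
  type: airflow
  serviceName: airflow_source          # any name identifying this service in OpenMetadata
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080  # URL of the Airflow instance
      # connection: ...                # metadata database connection, see the sample below
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api   # assumed local OpenMetadata server
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"         # placeholder token
```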

### 1. Define the YAML Config

This is a sample config for Airflow:

{% codePreview %}

{% codeInfoContainer %}

#### Source Configuration - Service Connection

{% codeInfo srNumber=1 %}

hostPort: URL to the Airflow instance.

{% /codeInfo %}

{% codeInfo srNumber=2 %}

numberOfStatus: Number of statuses to look back at in every ingestion (e.g., past executions of a DAG).

{% /codeInfo %}

{% codeInfo srNumber=3 %}

connection: Airflow metadata database connection. See these docs for supported backends.

For the connection we support the following selections:

- backend: Should not be used from the UI. This option only applies when ingesting Airflow metadata locally by running the ingestion from a DAG. It will use the current Airflow SQLAlchemy connection to extract the data.
- MySQL, Postgres, and SQLite: Pass the required credentials to reach each of these services. We will create a connection to that database and read the Airflow data from there.

{% /codeInfo %}

{% partial file="/v1.8/connectors/yaml/pipeline/source-config-def.md" /%}

{% partial file="/v1.8/connectors/yaml/ingestion-sink-def.md" /%}

{% partial file="/v1.8/connectors/yaml/workflow-config-def.md" /%}

{% /codeInfoContainer %}

{% codeBlock fileName="filename.yaml" %}

```yaml
source:
  type: airflow
  serviceName: airflow_source
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080
      numberOfStatus: 10
      # Connection needs to be one of Mysql, Postgres or Sqlite
      connection:
        type: Mysql
        username: airflow_user
        authType:
          password: airflow_pass
        databaseSchema: airflow_db
        hostPort: localhost:3306
        # #
        # type: Postgres
        # username: airflow_user
        # authType:
        #   password: airflow_pass
        # database: airflow_db
        # hostPort: localhost:3306
        # #
        # type: Sqlite
        # username: airflow_user
        # password: airflow_pass
        # database: airflow_db
        # hostPort: localhost:3306
        # databaseMode: ":memory:" (optional)
```

{% partial file="/v1.8/connectors/yaml/pipeline/source-config.md" /%}

{% partial file="/v1.8/connectors/yaml/ingestion-sink.md" /%}

{% partial file="/v1.8/connectors/yaml/workflow-config.md" /%}

{% /codeBlock %}

{% /codePreview %}

{% partial file="/v1.8/connectors/yaml/ingestion-cli.md" /%}