
---
title: Airflow Connector | OpenMetadata Workflow Orchestration
slug: /connectors/pipeline/airflow
---
{% connectorDetailsHeader name="Airflow" stage="PROD" platform="OpenMetadata" availableFeatures=["Pipelines", "Pipeline Status", "Lineage", "Owners", "Usage"] unavailableFeatures=["Tags"] /%}
In this section, we provide guides and references to use the Airflow connector.
Configure and schedule the Airflow metadata workflow from the OpenMetadata UI:
{% partial file="/v1.7/connectors/ingestion-modes-tiles.md" variables={yamlPath: "/connectors/pipeline/airflow/yaml"} /%}
{% tilesContainer %}
{% tile title="MWAA" description="Run the ingestion framework externally!" link="/deployment/ingestion/external/mwaa" /%}
{% tile title="GCS Composer" description="Run the ingestion from GCS Composer." link="/deployment/ingestion/external/gcs-composer" /%}
{% /tilesContainer %}
## Requirements
{% note %} We only support officially supported Airflow versions. You can check the list of supported versions in the Airflow documentation. {% /note %}
## Metadata Ingestion
{% partial file="/v1.7/connectors/metadata-ingestion-ui.md" variables={ connector: "Airflow", selectServicePath: "/images/v1.7/connectors/airflow/select-service.png", addNewServicePath: "/images/v1.7/connectors/airflow/add-new-service.png", serviceConnectionPath: "/images/v1.7/connectors/airflow/service-connection.png", } /%}
{% stepsContainer %} {% extraContent parentTagName="stepsContainer" %}
#### Connection Details
- Host and Port: URL to the Airflow instance.
- Number of Status: Number of past statuses to look back at in every ingestion (e.g., past executions of a DAG).
- Connection: Airflow metadata database connection. Refer to the Airflow documentation for the supported backends.
In terms of `connection`, we support the following selections:

- `backend`: Should not be used from the UI. This is only applicable when ingesting Airflow metadata locally by running the ingestion from a DAG. It will use the current Airflow SQLAlchemy connection to extract the data.
- `MySQL`, `Postgres`, and `SQLite`: Pass the required credentials to reach out to each of these services. We will create a connection to the pointed database and read the Airflow data from there.
{% /extraContent %}
{% partial file="/v1.7/connectors/test-connection.md" /%}
{% partial file="/v1.7/connectors/pipeline/configure-ingestion.md" /%}
{% partial file="/v1.7/connectors/ingestion-schedule-and-deploy.md" /%}
{% /stepsContainer %}