---
title: Run Airflow Connector using the CLI
slug: /openmetadata/connectors/pipeline/airflow/cli
---

Note that this installs the same Airflow version that we ship in the Ingestion Container, which is Airflow 2.3.3 from Release 0.12.

The ingestion using Airflow version 2.3.3 as a source package has been tested against Airflow 2.3.3 and Airflow 2.2.5.

## Source Configuration - Service Connection

- `hostPort`: URL to the Airflow instance.
- `numberOfStatus`: Number of statuses to look back at in every ingestion (e.g., past executions of a DAG).
- `connection`: Airflow metadata database connection (see the sketch below). See these docs for supported backends.
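Put together, these options live under the service connection section of the ingestion workflow YAML. The snippet below is a minimal sketch, not a complete workflow file: the surrounding keys (`source`, `serviceName`, `serviceConnection`, `config`, `type`) are assumed from the general OpenMetadata workflow layout and may differ between releases, and all values are placeholders.

```yaml
source:
  type: airflow
  serviceName: airflow_source
  serviceConnection:
    config:
      type: Airflow
      # URL to the Airflow instance
      hostPort: http://localhost:8080
      # Number of statuses to look back at in every ingestion
      numberOfStatus: 10
      # Airflow metadata database connection; see the options below
      connection:
        type: Backend
```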

For `connection`, we support the following options (a MySQL example follows the list):

- `backend`: Should not be used from the UI. This option only applies when ingesting Airflow metadata locally by running the ingestion from a DAG; it uses the current Airflow SQLAlchemy connection to extract the data.
- `MySQL`, `Postgres`, `MSSQL` and `SQLite`: Pass the required credentials to reach each of these services. We will create a connection to the specified database and read the Airflow data from there.
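For example, pointing the connector at an external MySQL Airflow metadata database could look like the sketch below. The `type: Mysql` value and the credential fields (`username`, `password`, `hostPort`, `databaseSchema`) are assumptions based on the standard MySQL connection options; swap in the block that matches your backend, and treat the values as placeholders.

```yaml
      # Inside serviceConnection.config of the sketch above
      connection:
        type: Mysql
        username: airflow_user        # placeholder credentials
        password: airflow_pass
        hostPort: mysql-host:3306     # Airflow metadata database host and port
        databaseSchema: airflow_db    # schema holding the Airflow tables
```

Once the workflow file is completed with the usual `sourceConfig`, `sink`, and `workflowConfig` sections, it is typically run from the CLI with `metadata ingest -c <path-to-yaml>`.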