---
title: Run Airflow Connector using the CLI
slug: /connectors/pipeline/airflow/cli
---

# Run Airflow using the metadata CLI

In this section, we provide guides and references to use the Airflow connector.

Configure and schedule Airflow metadata workflows from the CLI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)

## Requirements

<InlineCallout color="violet-70" icon="description" bold="OpenMetadata 0.12 or later" href="/deployment">
To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.
</InlineCallout>

To run the ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

### Python Requirements

To run the Airflow ingestion, you will need to install:

```bash
pip3 install "openmetadata-ingestion[airflow]"
```

Note that this installs the same Airflow version that we ship in the Ingestion Container, which is
Airflow `2.3.3` from Release `0.12`.

The ingestion using Airflow version 2.3.3 as a source package has been tested against Airflow 2.3.3 and Airflow 2.2.5.
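
As a quick sanity check after installing, you can verify which versions were resolved and that the CLI is available. This is a minimal sketch, assuming `pip3` and the `metadata` entrypoint are on your PATH:

```bash
# Check the resolved Airflow and OpenMetadata ingestion versions
pip3 show apache-airflow openmetadata-ingestion

# Confirm the metadata CLI entrypoint is installed
metadata --help
```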

## Metadata Ingestion

All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/pipeline/airflowConnection.json)
you can find the structure to create a connection to Airflow.

In order to create and run a Metadata Ingestion workflow, we will follow
the steps to create a YAML configuration able to connect to the source,
process the Entities if needed, and reach the OpenMetadata server.

The workflow is modeled around the following
[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json).

### 1. Define the YAML Config

This is a sample config for Airflow:

```yaml
source:
  type: airflow
  serviceName: airflow_source
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080
      numberOfStatus: 10
      # Connection needs to be one of Mysql, Postgres, Mssql or Sqlite
      connection:
        type: Mysql
        username: airflow_user
        password: airflow_pass
        databaseSchema: airflow_db
        hostPort: localhost:3306
        # #
        # type: Postgres
        # username: airflow_user
        # password: airflow_pass
        # database: airflow_db
        # hostPort: localhost:3306
        # #
        # type: Mssql
        # username: airflow_user
        # password: airflow_pass
        # database: airflow_db
        # hostPort: localhost:3306
        # uriString: http://... (optional)
        # #
        # type: Sqlite
        # username: airflow_user
        # password: airflow_pass
        # database: airflow_db
        # hostPort: localhost:3306
        # databaseMode: ":memory:" (optional)
  sourceConfig:
    config:
      type: PipelineMetadata
      # includeLineage: true
      # pipelineFilterPattern:
      #   includes:
      #     - pipeline1
      #     - pipeline2
      #   excludes:
      #     - pipeline3
      #     - pipeline4
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  # loggerLevel: DEBUG  # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: <OpenMetadata host and port>
    authProvider: <OpenMetadata auth provider>
```

#### Source Configuration - Service Connection

- **hostPort**: URL to the Airflow instance.
- **numberOfStatus**: Number of statuses we want to look back to in every ingestion (e.g., past executions from a DAG).
- **connection**: Airflow metadata database connection. See
  these [docs](https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html)
  for supported backends.

In terms of `connection` we support the following selections:

- `backend`: Should not be used from the UI. This is only applicable when ingesting Airflow metadata locally by running
  the ingestion from a DAG. It will use the current Airflow SQLAlchemy connection to extract the data.
- `MySQL`, `Postgres`, `MSSQL` and `SQLite`: Pass the required credentials to reach each of these services. We will
  create a connection to the pointed database and read the Airflow data from there, as sketched below.
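
For example, a minimal sketch of the `connection` block pointing at a Postgres-backed Airflow metadata database might look like the following. The credentials, database name, and port are placeholders for your own values:

```yaml
serviceConnection:
  config:
    type: Airflow
    hostPort: http://localhost:8080
    numberOfStatus: 10
    connection:
      type: Postgres
      username: airflow_user   # placeholder credentials
      password: airflow_pass
      database: airflow_db     # the Airflow metadata database
      hostPort: localhost:5432 # default Postgres port; adjust to your setup
```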

#### Source Configuration - Source Config

The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/pipelineServiceMetadataPipeline.json):

- `dbServiceName`: Database Service Name for the creation of lineage, if the source supports it.
- `pipelineFilterPattern`: Supports regex to include or exclude pipelines. E.g.,

```yaml
pipelineFilterPattern:
  includes:
    - users
    - type_test
```
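
Exclusion works the same way. For instance, a hypothetical pattern that skips any staging or temporary DAGs could look like:

```yaml
pipelineFilterPattern:
  excludes:
    - .*_staging
    - tmp_.*
```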

#### Sink Configuration

To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.
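
This is the same `sink` block used in the sample config above:

```yaml
sink:
  type: metadata-rest
  config: {}
```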

#### Workflow Configuration

The main property here is the `openMetadataServerConfig`, where you can define the host and security provider of your OpenMetadata installation.

For a simple, local installation using our docker containers, this looks like:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```

We support different security providers. You can find their definitions [here](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-spec/src/main/resources/json/schema/security/client).
You can find the different implementations of the ingestion below.

<Collapse title="Configure SSO in the Ingestion Workflows">

### OpenMetadata JWT Auth

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```

### Auth0 SSO

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```

### Azure SSO

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: azure
    securityConfig:
      clientSecret: '{your_client_secret}'
      authority: '{your_authority_url}'
      clientId: '{your_client_id}'
      scopes:
        - your_scopes
```

### Custom OIDC SSO

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```

### Google SSO

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: google
    securityConfig:
      secretKey: '{path-to-json-creds}'
```

### Okta SSO

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: okta
    securityConfig:
      clientId: "{CLIENT_ID - SPA APP}"
      orgURL: "{ISSUER_URL}/v1/token"
      privateKey: "{public/private keypair}"
      email: "{email}"
      scopes:
        - token
```

### Amazon Cognito SSO

The ingestion can be configured by [Enabling JWT Tokens](https://docs.open-metadata.org/deployment/security/enable-jwt-tokens).

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```

### OneLogin SSO

OneLogin uses Custom OIDC for the ingestion:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```

### KeyCloak SSO

KeyCloak uses Custom OIDC for the ingestion:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```

</Collapse>

### 2. Run with the CLI

First, we will need to save the YAML file. Afterward, and with all requirements installed, we can run:

```bash
metadata ingest -c <path-to-yaml>
```
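
If a run fails or behaves unexpectedly, a useful first step is to raise the logger level in the `workflowConfig` of the same YAML file and re-run the command. This reuses the `loggerLevel` option already shown commented out in the sample config above:

```yaml
workflowConfig:
  loggerLevel: DEBUG  # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: <OpenMetadata host and port>
    authProvider: <OpenMetadata auth provider>
```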

Note that this recipe stays the same from connector to connector. By updating the YAML configuration,
you will be able to extract metadata from different sources.