---
title: Run Dagster Connector using Airflow SDK
slug: /connectors/pipeline/dagster/airflow
---
# Run Dagster using the Airflow SDK
In this section, we provide guides and references to use the Dagster connector.
Configure and schedule Dagster metadata and profiler workflows from the OpenMetadata UI:
- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
## Requirements
<InlineCallout color="violet-70" icon="description" bold="OpenMetadata 0.12 or later" href="/deployment">
To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.
</InlineCallout>
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.
### Python Requirements
To run the Dagster ingestion, you will need to install:
```bash
pip3 install "openmetadata-ingestion[dagster]"
```
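If you want to confirm the package is available in your environment before wiring up the workflow, a quick version check works. This is just an optional sanity-check sketch, not a required step:

```python
# Optional sanity check: confirm that openmetadata-ingestion is installed
# and print its version (importlib.metadata ships with Python 3.8+).
from importlib.metadata import version

print(version("openmetadata-ingestion"))
```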
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/pipeline/dagsterConnection.json)
you can find the structure to create a connection to Dagster.
To create and run a Metadata Ingestion workflow, we will follow
the steps to create a YAML configuration that connects to the source,
processes the Entities if needed, and reaches the OpenMetadata server.
The workflow is modeled around the following
[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
### 1. Define the YAML Config
This is a sample config for Dagster:
```yaml
source:
  type: dagster
  serviceName: dagster_source
  serviceConnection:
    config:
      type: Dagster
      host: "https://<yourorghere>.dagster.cloud/prod" # or http://127.0.0.1:3000
      token: token
  sourceConfig:
    config:
      type: PipelineMetadata
      # includeLineage: true
      # pipelineFilterPattern:
      #   includes:
      #     - pipeline1
      #     - pipeline2
      #   excludes:
      #     - pipeline3
      #     - pipeline4
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  # loggerLevel: DEBUG # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: <OpenMetadata host and port>
    authProvider: <OpenMetadata auth provider>
```
#### Source Configuration - Service Connection
- **host**: Host and port of the Dagster instance.

<Note>

If Dagster is deployed on `localhost` and entering `https://localhost:3000` into hostPort gives a connection refused error, please enter `https://127.0.0.1:3000` into the hostPort and try again.

</Note>

- **ServiceConnection**
  - **Host**: Host of the Dagster instance, e.g., `https://localhost:3000`, `https://127.0.0.1:3000`, or `https://<yourorghere>.dagster.cloud/prod`.
  - **Token**: A token must be passed when connecting to a Dagster Cloud instance.
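Before running the workflow, you may want to confirm that the Dagster endpoint is reachable with the host and token you plan to use. Below is a minimal pre-flight sketch (not part of the connector) that queries Dagster's GraphQL API, served under `/graphql`; the `{ version }` query simply returns the Dagster version. The `Dagster-Cloud-Api-Token` header is only needed for Dagster Cloud, and the host value is a placeholder:

```python
# Minimal connectivity check against Dagster's GraphQL API.
# Not part of the connector; just a pre-flight sketch.
import requests

HOST = "http://127.0.0.1:3000"  # or "https://<yourorghere>.dagster.cloud/prod"
HEADERS = {"Dagster-Cloud-Api-Token": "<token>"}  # Dagster Cloud only

response = requests.post(
    f"{HOST}/graphql",
    json={"query": "{ version }"},  # returns the Dagster version string
    headers=HEADERS,
)
response.raise_for_status()
print(response.json())  # e.g. {'data': {'version': '1.x.y'}}
```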
#### Source Configuration - Source Config
The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/pipelineServiceMetadataPipeline.json):
- `dbServiceName`: Database Service Name for the creation of lineage, if the source supports it.
- `pipelineFilterPattern` and `chartFilterPattern`: Note that both `pipelineFilterPattern` and `chartFilterPattern` support regex as include or exclude. E.g.,
```yaml
pipelineFilterPattern:
  includes:
    - users
    - type_test
```
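To see how these patterns behave, here is a small conceptual illustration (not connector code) of include filters applied as regexes to pipeline names; the pipeline names below are made up:

```python
# Conceptual illustration of include-style regex filtering.
# The pipeline names are hypothetical.
import re

includes = ["users", "type_test"]
pipelines = ["users", "type_test", "internal_audit"]

selected = [
    name for name in pipelines
    if any(re.match(pattern, name) for pattern in includes)
]
print(selected)  # ['users', 'type_test']
```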
#### Sink Configuration
To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.
#### Workflow Configuration
The main property here is the `openMetadataServerConfig`, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```
We support different security providers. You can find their definitions [here](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-spec/src/main/resources/json/schema/security/client).
You can find the different implementations of the ingestion below.
<Collapse title="Configure SSO in the Ingestion Workflows">

### OpenMetadata JWT Auth
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```
### Auth0 SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### Azure SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: azure
    securityConfig:
      clientSecret: '{your_client_secret}'
      authority: '{your_authority_url}'
      clientId: '{your_client_id}'
      scopes:
        - your_scopes
```
### Custom OIDC SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### Google SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: google
    securityConfig:
      secretKey: '{path-to-json-creds}'
```
### Okta SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: okta
    securityConfig:
      clientId: "{CLIENT_ID - SPA APP}"
      orgURL: "{ISSUER_URL}/v1/token"
      privateKey: "{public/private keypair}"
      email: "{email}"
      scopes:
        - token
```
### Amazon Cognito SSO
The ingestion can be configured by [Enabling JWT Tokens](https://docs.open-metadata.org/deployment/security/enable-jwt-tokens):
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### OneLogin SSO
OneLogin uses Custom OIDC for the ingestion:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### KeyCloak SSO
KeyCloak uses Custom OIDC for the ingestion:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
</Collapse>
### 2. Prepare the Ingestion DAG
Create a Python file in your Airflow DAGs directory with the following contents:
```python
import yaml
from datetime import timedelta

from airflow import DAG
from airflow.utils.dates import days_ago

try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator

from metadata.ingestion.api.workflow import Workflow

default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

config = """
<your YAML configuration>
"""

def metadata_ingestion_workflow():
    # Parse the YAML recipe and hand it to the OpenMetadata Workflow runner.
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()

with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="*/5 * * * *",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```
Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will
be able to extract metadata from different sources.
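If you want to validate the YAML before deploying the DAG, the same workflow can be run standalone. Here is a minimal sketch, assuming the configuration above is saved as `dagster.yaml` (the filename is arbitrary):

```python
# Run the ingestion workflow directly, without Airflow, to debug the config.
import yaml

from metadata.ingestion.api.workflow import Workflow

def run():
    # "dagster.yaml" is an assumed filename; point this at your own config.
    with open("dagster.yaml") as config_file:
        workflow_config = yaml.safe_load(config_file)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()

if __name__ == "__main__":
    run()
```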