
---
title: Run Kafka Connector using Airflow SDK
slug: /connectors/messaging/kafka/airflow
---
# Run Kafka using the Airflow SDK
In this section, we provide guides and references to use the Kafka connector.
Configure and schedule Kafka metadata workflows using the Airflow SDK:
- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
## Requirements
{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
{%/inlineCallout%}
To run the Ingestion via Airflow you'll need the `openmetadata-ingestion` package installed in your Airflow environment. The OpenMetadata Ingestion Container comes shipped with
custom Airflow plugins to handle the workflow deployment.
### Python Requirements
To run the Kafka ingestion, you will need to install:
```bash
pip3 install "openmetadata-ingestion[kafka]"
```
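If you want to confirm the install before moving on, a quick import check like the one below should run without errors. This is a sanity-check sketch, not part of the official setup; both imports are expected to be provided by the `openmetadata-ingestion[kafka]` install.

```python
# Sanity-check sketch: both imports should succeed after installing
# openmetadata-ingestion[kafka]; if either fails, revisit the pip command above.
from confluent_kafka import Consumer  # Kafka client pulled in by the [kafka] extra
from metadata.ingestion.api.workflow import Workflow  # ingestion framework used later in this guide

print("openmetadata-ingestion[kafka] is ready")
```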
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/messaging/kafkaConnection.json)
you can find the structure to create a connection to Kafka.
To create and run a Metadata Ingestion workflow, we will follow the steps below to build
a YAML configuration that connects to the source,
processes the Entities if needed, and reaches the OpenMetadata server.
The workflow is modeled around the following
[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json).
### 1. Define the YAML Config
This is a sample config for Kafka:
{% codePreview %}
{% codeInfoContainer %}
#### Source Configuration - Service Connection
{% codeInfo srNumber=1 %}
**bootstrapServers**: List of brokers as comma-separated values of broker `host` or `host:port`.
Example: `host1:9092,host2:9092`
{% /codeInfo %}
{% codeInfo srNumber=2 %}
**schemaRegistryURL**: URL of the Schema Registry used to ingest the schemas of the topics.
**NOTE**: For now, the schema will be the last version found for the schema name `{topic-name}-value`. An [issue](https://github.com/open-metadata/OpenMetadata/issues/10399) to improve how it currently works has been opened.
{% /codeInfo %}
{% codeInfo srNumber=3 %}
**saslUsername**: SASL username for use with the PLAIN and SASL-SCRAM mechanisms.
{% /codeInfo %}
{% codeInfo srNumber=4 %}
**saslPassword**: SASL password for use with the PLAIN and SASL-SCRAM mechanisms.
{% /codeInfo %}
{% codeInfo srNumber=5 %}
**saslMechanism**: SASL mechanism to use for authentication.
Supported: _GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, OAUTHBEARER_.
**NOTE**: Despite the name, only one mechanism must be configured.
{% /codeInfo %}
{% codeInfo srNumber=6 %}
**basicAuthUserInfo**: Schema Registry Client HTTP credentials in the form of `username:password`.
By default, user info is extracted from the URL if present.
{% /codeInfo %}
{% codeInfo srNumber=7 %}
**consumerConfig**: The accepted additional values for the consumer configuration can be found in the following
[link](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md).
{% /codeInfo %}
{% codeInfo srNumber=8 %}
**schemaRegistryConfig**: The accepted additional values for the Schema Registry configuration can be found in the
following [link](https://docs.confluent.io/5.5.1/clients/confluent-kafka-python/index.html#confluent_kafka.schema_registry.SchemaRegistryClient).
**Note:** To ingest the topic schema, `schemaRegistryURL` must be passed.
{% /codeInfo %}
#### Source Configuration - Source Config
{% codeInfo srNumber=9 %}
The sourceConfig is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/messagingServiceMetadataPipeline.json):
**generateSampleData:** Option to turn on/off generating sample data during metadata extraction.
**topicFilterPattern:** Note that the `topicFilterPattern` supports regex as include or exclude.
{% /codeInfo %}
#### Sink Configuration
{% codeInfo srNumber=10 %}
To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.
{% /codeInfo %}
{% partial file="workflow-config.md" /%}
{% /codeInfoContainer %}
{% codeBlock fileName="filename.yaml" %}
```yaml
source:
  type: kafka
  serviceName: local_kafka
  serviceConnection:
    config:
      type: Kafka
```
```yaml {% srNumber=1 %}
      bootstrapServers: localhost:9092
```
```yaml {% srNumber=2 %}
      schemaRegistryURL: http://localhost:8081 # Needs to be a URI
```
```yaml {% srNumber=3 %}
      saslUsername: username
```
```yaml {% srNumber=4 %}
      saslPassword: password
```
```yaml {% srNumber=5 %}
      saslMechanism: PLAIN
```
```yaml {% srNumber=6 %}
      basicAuthUserInfo: username:password
```
```yaml {% srNumber=7 %}
      consumerConfig: {}
```
```yaml {% srNumber=8 %}
      schemaRegistryConfig: {}
```
```yaml {% srNumber=9 %}
  sourceConfig:
    config:
      type: MessagingMetadata
      topicFilterPattern:
        excludes:
          - _confluent.*
      # includes:
      #   - topic1
      # generateSampleData: true
```
```yaml {% srNumber=10 %}
sink:
  type: metadata-rest
  config: {}
```
{% partial file="workflow-config-yaml.md" /%}
{% /codeBlock %}
{% /codePreview %}
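Before wiring this YAML into a DAG, it can help to confirm that the broker and SASL credentials are reachable from the machine that will run the ingestion. The sketch below uses `confluent_kafka` directly, the same client the connector relies on; the host, credentials, and `security.protocol` values are placeholders you would adapt to your cluster.

```python
# Hedged connectivity check: maps the connection fields above onto a raw
# confluent_kafka Consumer. All values shown are illustrative placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",       # bootstrapServers
    "group.id": "openmetadata-connection-test",  # throwaway consumer group
    "security.protocol": "SASL_PLAINTEXT",       # assumption: adjust to SASL_SSL etc.
    "sasl.mechanism": "PLAIN",                   # saslMechanism
    "sasl.username": "username",                 # saslUsername
    "sasl.password": "password",                 # saslPassword
})

# list_topics() fails fast if the broker address or credentials are wrong
cluster_metadata = consumer.list_topics(timeout=10)
print(sorted(cluster_metadata.topics))
consumer.close()
```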
### 2. Prepare the Ingestion DAG
Create a Python file in your Airflow DAGs directory with the following contents:
{% codePreview %}
{% codeInfoContainer %}
{% codeInfo srNumber=8 %}
#### Import necessary modules
The `Workflow` class being imported is part of the metadata ingestion framework, which defines the process of getting data from different sources and ingesting it into a central metadata repository.
Here we also import the basic requirements to parse YAML, handle dates, and build our DAG.
{% /codeInfo %}
{% codeInfo srNumber=9 %}
**Default arguments for all tasks in the Airflow DAG.**
- The default arguments dictionary sets the defaults for every task in the DAG, including the owner's name, email address, number of retries, retry delay, and execution timeout.
{% /codeInfo %}
{% codeInfo srNumber=10 %}
- **config**: Specifies the config for the metadata ingestion as we prepared above.
{% /codeInfo %}
{% codeInfo srNumber=11 %}
- **metadata_ingestion_workflow()**: This code defines a function `metadata_ingestion_workflow()` that loads a YAML configuration, creates a `Workflow` object, executes the workflow, checks its status, prints the status to the console, and stops the workflow.
{% /codeInfo %}
{% codeInfo srNumber=12 %}
- **DAG**: creates a DAG using the Airflow framework; tune the DAG configuration to whatever fits your requirements.
- For more details on creating Airflow DAGs, visit [here](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dags.html#declaring-a-dag).
{% /codeInfo %}
Note that this recipe is the same from connector to connector.
By updating the YAML configuration, you will be able to extract metadata from different sources.
{% /codeInfoContainer %}
{% codeBlock fileName="filename.py" %}
```python {% srNumber=8 %}
import yaml
from datetime import timedelta

from airflow import DAG
from airflow.utils.dates import days_ago

from metadata.ingestion.api.workflow import Workflow

try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator
```
```python {% srNumber=9 %}
default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60)
}
```
```python {% srNumber=10 %}
config = """
<your YAML configuration>
"""
```
```python {% srNumber=11 %}
def metadata_ingestion_workflow():
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()
```
```python {% srNumber=12 %}
with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval='*/5 * * * *',
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```
{% /codeBlock %}
{% /codePreview %}
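When debugging the recipe, you don't need to wait for the scheduler: calling the task's callable directly surfaces configuration or connection errors immediately. A minimal sketch, assuming you saved the file above as `kafka_ingestion_dag.py` (hypothetical name) somewhere importable:

```python
# Run the ingestion once, outside Airflow, to validate the YAML and the
# Kafka connection. raise_from_status() inside the callable raises on failure.
from kafka_ingestion_dag import metadata_ingestion_workflow  # hypothetical module name

if __name__ == "__main__":
    metadata_ingestion_workflow()
```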