---
title: Run Kafka Connector using Airflow SDK
slug: /connectors/messaging/kafka/airflow
---

# Run Kafka using the Airflow SDK

In this section, we provide guides and references to use the Kafka connector.

In this guide, you will configure and schedule Kafka metadata workflows using the Airflow SDK:

## Requirements

{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%} To deploy OpenMetadata, check the Deployment guides. {%/inlineCallout%}

To run the ingestion via the UI, you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.

### Python Requirements

To run the Kafka ingestion, you will need to install:

pip3 install "openmetadata-ingestion[kafka]"

## Metadata Ingestion

All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to Kafka.

To create and run a Metadata Ingestion workflow, we will follow these steps to build a YAML configuration that connects to the source, processes the Entities if needed, and reaches the OpenMetadata server.

The workflow is modeled around the following JSON Schema.
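
Before filling in the details, it helps to see the overall shape of that configuration. The skeleton below is only a sketch of the structure used on this page; every block is explained in the next step, and the workflowConfig section is covered by the workflow configuration reference further down.

```yaml
source:
  type: kafka                 # connector type
  serviceName: <service name> # how the service will be named in OpenMetadata
  serviceConnection:
    config: {}                # Kafka connection details (Service Connection below)
  sourceConfig:
    config: {}                # extraction options such as topicFilterPattern
sink:
  type: metadata-rest         # send the metadata to the OpenMetadata server
  config: {}
# workflowConfig: ...         # OpenMetadata server settings (see the workflow config section)
```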

### 1. Define the YAML Config

This is a sample config for Kafka:

{% codePreview %}

{% codeInfoContainer %}

#### Source Configuration - Service Connection

{% codeInfo srNumber=1 %}

bootstrapServers: List of brokers as comma-separated values of broker host or host:port.

Example: host1:9092,host2:9092

{% /codeInfo %}

{% codeInfo srNumber=2 %}

schemaRegistryURL: URL of the Schema Registry used to ingest the schemas of the topics.

NOTE: For now, the schema will be the last version found for the schema name {topic-name}-value. An issue to improve how it currently works has been opened.

{% /codeInfo %}

{% codeInfo srNumber=3 %}

saslUsername: SASL username for use with the PLAIN and SASL-SCRAM mechanisms.

{% /codeInfo %}

{% codeInfo srNumber=4 %}

saslPassword: SASL password for use with the PLAIN and SASL-SCRAM mechanisms.

{% /codeInfo %}

{% codeInfo srNumber=5 %}

saslMechanism: SASL mechanism to use for authentication.

Supported: GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, OAUTHBEARER.

NOTE: Despite the name, only one mechanism must be configured.
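
For instance, a SCRAM-based connection could be declared as follows; the mechanism, user, and password below are placeholder values, not settings required by the connector:

```yaml
      saslMechanism: SCRAM-SHA-256
      saslUsername: my-user
      saslPassword: my-password
```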

{% /codeInfo %}

{% codeInfo srNumber=6 %}

basicAuthUserInfo: Schema Registry Client HTTP credentials in the form of username:password.

By default, user info is extracted from the URL if present.
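
As an illustration, both of the following forms carry the same credentials (registry-user and registry-pass are placeholders):

```yaml
      # credentials embedded in the Schema Registry URL...
      schemaRegistryURL: http://registry-user:registry-pass@localhost:8081
      # ...or passed explicitly
      basicAuthUserInfo: registry-user:registry-pass
```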

{% /codeInfo %}

{% codeInfo srNumber=7 %}

consumerConfig: The accepted additional values for the consumer configuration can be found in the following link.
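
As a sketch only, additional consumer properties can be passed as key/value pairs; the keys below are standard Kafka client settings chosen for illustration, not values the connector requires:

```yaml
      consumerConfig:
        "security.protocol": SASL_SSL
        "ssl.ca.location": /path/to/ca.pem
```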

{% /codeInfo %}

{% codeInfo srNumber=8 %}

schemaRegistryConfig: The accepted additional values for the Schema Registry configuration can be found in the following link.

Note: To ingest the topic schema, schemaRegistryURL must be passed.
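
For example, TLS settings for the Schema Registry client could be supplied this way (illustrative key and path only):

```yaml
      schemaRegistryConfig:
        "ssl.ca.location": /path/to/registry-ca.pem
```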

{% /codeInfo %}

#### Source Configuration - Source Config

{% codeInfo srNumber=9 %}

The sourceConfig is defined here:

generateSampleData: Option to turn on/off generating sample data during metadata extraction.

topicFilterPattern: Note that the topicFilterPattern supports regex as include or exclude.
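
For example, the following pattern keeps topics starting with orders_ while dropping Confluent internal ones; the regexes are placeholders to adapt to your topic naming:

```yaml
      topicFilterPattern:
        includes:
          - orders_.*
        excludes:
          - _confluent.*
```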

{% /codeInfo %}

#### Sink Configuration

{% codeInfo srNumber=10 %}

To send the metadata to OpenMetadata, it needs to be specified as type: metadata-rest.

{% /codeInfo %}

{% partial file="workflow-config.md" /%}

{% /codeInfoContainer %}

{% codeBlock fileName="filename.yaml" %}

source:
  type: kafka
  serviceName: local_kafka
  serviceConnection:
    config:
      type: Kafka
      bootstrapServers: localhost:9092
      schemaRegistryURL: http://localhost:8081  # Needs to be a URI
      saslUsername: username
      saslPassword: password
      saslMechanism: PLAIN
      basicAuthUserInfo: username:password
      consumerConfig: {}
      schemaRegistryConfig: {}
  sourceConfig:
    config:
      type: MessagingMetadata
      topicFilterPattern:
        excludes:
          - _confluent.*
      # includes:
      #   - topic1
      # generateSampleData: true
sink:
  type: metadata-rest
  config: {}

{% partial file="workflow-config-yaml.md" /%}

{% /codeBlock %}

{% /codePreview %}

### 2. Prepare the Ingestion DAG

Create a Python file in your Airflow DAGs directory with the following contents:

{% codePreview %}

{% codeInfoContainer %}

{% codeInfo srNumber=8 %}

Import necessary modules

The Workflow class that is being imported is part of the metadata ingestion framework, which defines the process of getting data from different sources and ingesting it into a central metadata repository.

Here we are also importing all the basic requirements to parse YAMLs, handle dates and build our DAG.

{% /codeInfo %}

{% codeInfo srNumber=9 %}

Default arguments for all tasks in the Airflow DAG.

- The default arguments dictionary contains the default settings for all tasks in the DAG, including the owner's name, email address, number of retries, retry delay, and execution timeout.

{% /codeInfo %}

{% codeInfo srNumber=10 %}

- config: Specifies the metadata ingestion configuration that we prepared above.

{% /codeInfo %}

{% codeInfo srNumber=11 %}

- metadata_ingestion_workflow(): This function loads the YAML configuration, creates a Workflow object, executes the workflow, checks its status, prints the status to the console, and stops the workflow.

{% /codeInfo %}

{% codeInfo srNumber=12 %}

- DAG: Creates a DAG using the Airflow framework; tune the DAG configuration to whatever fits your requirements.
- For more details on creating Airflow DAGs, visit here.

{% /codeInfo %}

Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.

{% /codeInfoContainer %}

{% codeBlock fileName="filename.py" %}

import pathlib
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.config.common import load_config_file
from metadata.ingestion.api.workflow import Workflow
from airflow.utils.dates import days_ago

try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator

default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60)
}

config = """
<your YAML configuration>
"""

def metadata_ingestion_workflow():
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()

with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs a OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval='*/5 * * * *',
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )

{% /codeBlock %}

{% /codePreview %}