---
title: Run the Airbyte Connector Externally
slug: /connectors/pipeline/airbyte/yaml
---

# Run the Airbyte Connector Externally

In this section, we provide guides and references to use the Airbyte connector.

Configure and schedule Airbyte metadata workflows from the CLI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)

{% partial file="/v1.1/connectors/external-ingestion-deployment.md" /%}

## Requirements

{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
{% /inlineCallout %}

### Python Requirements

To run the Airbyte ingestion, you will need to install:

```bash
pip3 install "openmetadata-ingestion[airbyte]"
```

## Metadata Ingestion

All connectors are defined as JSON Schemas. [Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/pipeline/airbyteConnection.json) you can find the structure to create a connection to Airbyte.

In order to create and run a Metadata Ingestion workflow, we will follow these steps to create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.

The workflow is modeled around the following [JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json).

### 1. Define the YAML Config

This is a sample config for Airbyte:

{% codePreview %}

{% codeInfoContainer %}

#### Source Configuration - Service Connection

{% codeInfo srNumber=1 %}

**hostPort**: Pipeline Service Management UI URL. For example, `http://localhost:8000`.

{% /codeInfo %}

#### Source Configuration - Source Config

{% codeInfo srNumber=2 %}

The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/pipelineServiceMetadataPipeline.json):

**dbServiceNames**: Database Service Names for the creation of lineage, if the source supports it.

**includeTags**: Set the 'Include Tags' toggle to control whether to include tags as part of metadata ingestion.

**includeLineage**: Optional configuration to turn off fetching lineage from pipelines.

**markDeletedPipelines**: Set the 'Mark Deleted Pipelines' toggle to flag pipelines as soft-deleted if they are no longer present in the source system.

**pipelineFilterPattern**: Note that `pipelineFilterPattern` supports regex as include or exclude.

{% /codeInfo %}

#### Sink Configuration

{% codeInfo srNumber=3 %}

To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.

{% /codeInfo %}

{% partial file="/v1.1/connectors/workflow-config.md" /%}

{% /codeInfoContainer %}

{% codeBlock fileName="filename.yaml" %}

```yaml
source:
  type: airbyte
  serviceName: airbyte_source
  serviceConnection:
    config:
      type: Airbyte
```
```yaml {% srNumber=1 %}
      hostPort: http://localhost:8000
```
```yaml {% srNumber=2 %}
  sourceConfig:
    config:
      type: PipelineMetadata
      # markDeletedPipelines: True
      # includeTags: True
      # includeLineage: true
      # pipelineFilterPattern:
      #   includes:
      #     - pipeline1
      #     - pipeline2
      #   excludes:
      #     - pipeline3
      #     - pipeline4
```
```yaml {% srNumber=3 %}
sink:
  type: metadata-rest
  config: {}
```

{% partial file="/v1.1/connectors/workflow-config-yaml.md" /%}

{% /codeBlock %}

{% /codePreview %}

### 2. Run with the CLI

First, we will need to save the YAML file.
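For instance, assuming the configuration above was saved as `airbyte.yaml` (a hypothetical filename; any path works), a quick sanity check that the file parses as YAML can catch indentation mistakes before the ingestion runs. This sketch relies on PyYAML, which is installed as a dependency of `openmetadata-ingestion`:

```bash
# Optional sanity check: confirm the saved file is well-formed YAML.
# "airbyte.yaml" is a hypothetical filename; adjust it to your setup.
python3 -c "import yaml; yaml.safe_load(open('airbyte.yaml')); print('config OK')"
```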
Afterward, and with all requirements installed, we can run:

```bash
metadata ingest -c <path_to_yaml>
```

Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.
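Since the ingestion is just a CLI call, any external scheduler can be used to run it periodically. Below is a minimal sketch using cron; both the CLI path and the YAML path are assumptions to adjust for your environment:

```bash
# Hypothetical crontab entry: re-ingest Airbyte metadata every day at 02:00.
# Both paths are assumptions; adjust them to your installation.
0 2 * * * /usr/local/bin/metadata ingest -c /opt/openmetadata/airbyte.yaml >> /var/log/openmetadata/airbyte.log 2>&1
```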