
---
title: Run DomoDatabase Connector using the CLI
slug: /connectors/database/domo-database/cli
---

# Run Domo Database using the metadata CLI
{% multiTablesWrapper %}

| Feature            | Status                       |
| :----------------- | :--------------------------- |
| Stage              | PROD                         |
| Metadata           | {% icon iconName="check" /%} |
| Query Usage        | {% icon iconName="cross" /%} |
| Data Profiler      | {% icon iconName="check" /%} |
| Data Quality       | {% icon iconName="check" /%} |
| Lineage            | Partially via Views          |
| DBT                | {% icon iconName="check" /%} |
| Supported Versions | --                           |

| Feature      | Status                       |
| :----------- | :--------------------------- |
| Lineage      | Partially via Views          |
| Table-level  | {% icon iconName="check" /%} |
| Column-level | {% icon iconName="check" /%} |

{% /multiTablesWrapper %}
In this section, we provide guides and references to use the Domo Database connector.
Configure and schedule DomoDatabase metadata and profiler workflows from the CLI:
## Requirements
{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%} To deploy OpenMetadata, check the Deployment guides. {%/inlineCallout%}
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.
**Note:** For metadata ingestion, make sure to grant at least the `data` scope to the Client ID provided. For questions related to scopes, click here.
### Python Requirements
To run the DomoDatabase ingestion, you will need to install:

```shell
pip3 install "openmetadata-ingestion[domo]"
```
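If you prefer to keep the connector's dependencies isolated, a minimal sketch of the same installation inside a virtual environment could look like the following (the `om-venv` directory name is just an example):

```shell
# create and activate an isolated environment (directory name is arbitrary)
python3 -m venv om-venv
source om-venv/bin/activate

# install the ingestion framework with the Domo plugin
pip3 install "openmetadata-ingestion[domo]"
```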
## Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to DomoDatabase.

In order to create and run a Metadata Ingestion workflow, we will follow the steps to create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.

The workflow is modeled around the following JSON Schema.
### 1. Define the YAML Config
This is a sample config for DomoDatabase:
{% codePreview %}
{% codeInfoContainer %}
#### Source Configuration - Service Connection
{% codeInfo srNumber=1 %}
**clientId**: Client ID to connect to Domo Database.
{% /codeInfo %}
{% codeInfo srNumber=2 %}
**secretToken**: Secret token to connect to Domo Database.
{% /codeInfo %}
{% codeInfo srNumber=3 %}
**accessToken**: Access token to connect to Domo Database.
{% /codeInfo %}
{% codeInfo srNumber=4 %}
**apiHost**: API host to connect to the Domo Database instance.
{% /codeInfo %}
{% codeInfo srNumber=5 %}
**sandboxDomain**: Sandbox domain to connect to, e.g. `https://<api_domo>.domo.com`.
{% /codeInfo %}
{% codeInfo srNumber=6 %}
**database**: Optional name to give to the database in OpenMetadata. If left blank, we will use `default` as the database name.
{% /codeInfo %}
#### Source Configuration - Source Config
{% codeInfo srNumber=7 %}
The `sourceConfig` is defined here:

- **markDeletedTables**: To flag tables as soft-deleted if they are not present anymore in the source system.
- **includeTables**: `true` or `false`, to ingest table data. Default is `true`.
- **includeViews**: `true` or `false`, to ingest view definitions.
- **databaseFilterPattern**, **schemaFilterPattern**, **tableFilterPattern**: Note that each filter supports regex as include or exclude patterns; see the sketch after the full configuration below for a worked example.
{% /codeInfo %}
#### Sink Configuration
{% codeInfo srNumber=8 %}
To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.
{% /codeInfo %}
{% partial file="workflow-config.md" /%}
#### Advanced Configuration
{% codeInfo srNumber=10 %}
**Connection Options (Optional)**: Enter the details for any additional connection options that can be sent to Domo during the connection. These details must be added as Key-Value pairs.
{% /codeInfo %}
{% codeInfo srNumber=11 %}
**Connection Arguments (Optional)**: Enter the details for any additional connection arguments such as security or protocol configs that can be sent to Domo during the connection. These details must be added as Key-Value pairs.

- In case you are using Single-Sign-On (SSO) for authentication, add the `authenticator` details in the Connection Arguments as a Key-Value pair as follows: `"authenticator" : "sso_login_url"`
{% /codeInfo %}
{% /codeInfoContainer %}
{% codeBlock fileName="filename.yaml" %}
```yaml
source:
  type: domodatabase
  serviceName: local_DomoDatabase
  serviceConnection:
    config:
      type: DomoDatabase
      clientId: client-id
      secretToken: secret-token
      accessToken: access-token
      apiHost: api.domo.com
      sandboxDomain: https://<api_domo>.domo.com
      # database: database
      # connectionOptions:
      #   key: value
      # connectionArguments:
      #   key: value
  sourceConfig:
    config:
      type: DatabaseMetadata
      markDeletedTables: true
      includeTables: true
      includeViews: true
      # includeTags: true
      # databaseFilterPattern:
      #   includes:
      #     - database1
      #     - database2
      #   excludes:
      #     - database3
      #     - database4
      # schemaFilterPattern:
      #   includes:
      #     - schema1
      #     - schema2
      #   excludes:
      #     - schema3
      #     - schema4
      # tableFilterPattern:
      #   includes:
      #     - users
      #     - type_test
      #   excludes:
      #     - table3
      #     - table4
sink:
  type: metadata-rest
  config: {}
```
{% partial file="workflow-config-yaml.md" /%}
{% /codeBlock %}
{% /codePreview %}
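As a concrete illustration of the optional pieces above, here is a hedged sketch of the same source block with an SSO `authenticator` connection argument and regex filter patterns filled in. The login URL and pattern values are placeholders chosen for this example, not defaults shipped with the connector:

```yaml
source:
  type: domodatabase
  serviceName: local_DomoDatabase
  serviceConnection:
    config:
      type: DomoDatabase
      clientId: client-id
      secretToken: secret-token
      accessToken: access-token
      apiHost: api.domo.com
      sandboxDomain: https://<api_domo>.domo.com
      # only needed when authenticating through SSO (placeholder URL)
      connectionArguments:
        authenticator: https://my-idp.example.com/sso
  sourceConfig:
    config:
      type: DatabaseMetadata
      markDeletedTables: true
      includeTables: true
      includeViews: true
      # regex filters: skip every schema whose name starts with "tmp_"
      schemaFilterPattern:
        excludes:
          - tmp_.*
      # and ingest only tables whose names end in "_prod"
      tableFilterPattern:
        includes:
          - .*_prod
sink:
  type: metadata-rest
  config: {}
```

As the commented block in the main example shows, a single filter pattern can also carry both `includes` and `excludes` lists at once.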
### 2. Run with the CLI
First, we will need to save the YAML file. Afterward, and with all requirements installed, we can run:

```shell
metadata ingest -c <path-to-yaml>
```
Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.
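For instance, if the configuration above were saved as `domodatabase.yaml` (a filename chosen here purely for illustration), the run would look like:

```shell
metadata ingest -c ./domodatabase.yaml
```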
## dbt Integration
{% tilesContainer %}
{% tile icon="mediation" title="dbt Integration" description="Learn more about how to ingest dbt models' definitions and their lineage." link="/connectors/ingestion/workflows/dbt" /%}
{% /tilesContainer %}
## Related
{% tilesContainer %}
{% tile title="Ingest with Airflow" description="Configure the ingestion using Airflow SDK" link="/connectors/database/domo-database/airflow" /%}
{% /tilesContainer %}