---
title: Run DomoDatabase Connector using the CLI
slug: /connectors/database/domo-database/cli
---
# Run Domo Database using the metadata CLI
| Stage | Metadata | Query Usage | Data Profiler | Data Quality | Lineage | DBT | Supported Versions |
|---|---|---|---|---|---|---|---|
| PROD | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | -- |
| Lineage | Table-level | Column-level |
|---|---|---|
| ❌ | ❌ | ❌ |
In this section, we provide guides and references to use the Domo Database connector.
Configure and schedule DomoDatabase metadata and profiler workflows from the CLI:
## Requirements
To deploy OpenMetadata, check the Deployment guides. To run the ingestion via the UI, you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.
For metadata ingestion, make sure to add at least the `data` scope to the clientId provided. For questions related to scopes, click here.
### Python Requirements
To run the DomoDatabase ingestion, you will need to install:
```bash
pip3 install "openmetadata-ingestion[domo]"
```
## Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to DomoDatabase.
In order to create and run a Metadata Ingestion workflow, we will follow the steps to create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following JSON Schema.
### 1. Define the YAML Config
This is a sample config for DomoDatabase:
```yaml
source:
  type: domodatabase
  serviceName: local_domodatabase
  serviceConnection:
    config:
      type: DomoDatabase
      clientId: clientid
      secretToken: secret-token
      accessToken: access-token
      apiHost: api.domo.com
      sandboxDomain: https://<api_domo>.domo.com
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  # loggerLevel: DEBUG  # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: <OpenMetadata host and port>
    authProvider: <OpenMetadata auth provider>
```
### Source Configuration - Service Connection

- Client ID: Client ID to connect to DOMO Database.
- Secret Token: Secret Token to connect to DOMO Database.
- Access Token: Access Token to connect to DOMO Database.
- API Host: API Host to connect to the DOMO Database instance.
- Sandbox Domain: Connect to the Sandbox Domain.
### Source Configuration - Source Config

The `sourceConfig` is defined here:

- `markDeletedTables`: To flag tables as soft-deleted if they are not present anymore in the source system.
- `includeTables`: true or false, to ingest table data. Default is true.
- `includeViews`: true or false, to ingest views definitions.
- `databaseFilterPattern`, `schemaFilterPattern`, `tableFilterPattern`: Note that they support regex as include or exclude. E.g.,
```yaml
tableFilterPattern:
  includes:
    - users
    - type_test
```
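The same pattern keys also accept an excludes list. A minimal sketch (the regex below is a hypothetical example, not a required value):

```yaml
tableFilterPattern:
  excludes:
    - ".*_staging$"  # hypothetical: skip any table ending in _staging
```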
### Sink Configuration

To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.
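As shown in the sample configuration above, this is all the sink section needs:

```yaml
sink:
  type: metadata-rest
  config: {}
```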
### Workflow Configuration

The main property here is the `openMetadataServerConfig`, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```
We support different security providers. You can find their definitions here. You can find the different implementations of the ingestion below.
### OpenMetadata JWT Auth
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```
### Auth0 SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### Azure SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: azure
    securityConfig:
      clientSecret: '{your_client_secret}'
      authority: '{your_authority_url}'
      clientId: '{your_client_id}'
      scopes:
        - your_scopes
```
### Custom OIDC SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### Google SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: google
    securityConfig:
      secretKey: '{path-to-json-creds}'
```
### Okta SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: okta
    securityConfig:
      clientId: "{CLIENT_ID - SPA APP}"
      orgURL: "{ISSUER_URL}/v1/token"
      privateKey: "{public/private keypair}"
      email: "{email}"
      scopes:
        - token
```
### Amazon Cognito SSO

The ingestion can be configured by enabling JWT tokens:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### OneLogin SSO

Which uses Custom OIDC for the ingestion:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### KeyCloak SSO

Which uses Custom OIDC for the ingestion:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
### 2. Run with the CLI
First, we will need to save the YAML file. Afterward, and with all requirements installed, we can run:
```bash
metadata ingest -c <path-to-yaml>
```
Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.
## Data Profiler

### 1. Define the YAML Config

This is a sample config for the profiler:
```yaml
source:
  type: domodatabase
  serviceName: <service name>
  serviceConnection:
    config:
      type: DomoDatabase
      clientId: client-id
      secretToken: secret-token
      accessToken: access-token
      apiHost: api.domo.com
      sandboxDomain: https://<api_domo>.domo.com
  sourceConfig:
    config:
      type: Profiler
      # generateSampleData: true
      # profileSample: 85
      # threadCount: 5 (default)
      # databaseFilterPattern:
      #   includes:
      #     - database1
      #     - database2
      #   excludes:
      #     - database3
      #     - database4
      # schemaFilterPattern:
      #   includes:
      #     - schema1
      #     - schema2
      #   excludes:
      #     - schema3
      #     - schema4
      # tableFilterPattern:
      #   includes:
      #     - table1
      #     - table2
      #   excludes:
      #     - table3
      #     - table4
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  # loggerLevel: DEBUG  # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: <OpenMetadata host and port>
    authProvider: <OpenMetadata auth provider>
```
### Source Configuration

- You can find all the definitions and types for the `serviceConnection` here.
- The `sourceConfig` is defined here.
Note that the filter patterns support regex as includes or excludes. E.g.,
```yaml
tableFilterPattern:
  includes:
    - ".*users$"
```
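For example, to profile only a specific schema while skipping temporary tables (the names below are hypothetical):

```yaml
schemaFilterPattern:
  includes:
    - analytics        # hypothetical schema name
tableFilterPattern:
  excludes:
    - ".*_tmp$"        # hypothetical: skip temporary tables
```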
### Workflow Configuration

The same as the metadata ingestion.
### 2. Run with the CLI
After saving the YAML config, we will run the command the same way we did for the metadata ingestion:
```bash
metadata profile -c <path-to-yaml>
```
Note how instead of running `ingest`, we are using the `profile` command to select the Profiler workflow.