The DynamoDB connector ingests metadata using the DynamoDB boto3 client.
OpenMetadata retrieves information about all tables in the AWS account, so the user must have permissions to perform the `dynamodb:ListTables` operation.
The policy defined below grants the permissions required to list all tables in DynamoDB:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:ListTables"
            ],
            "Resource": "*"
        }
    ]
}
```
For more information on DynamoDB permissions, visit the [AWS DynamoDB official documentation](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/api-permissions-reference.html).
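As a quick sanity check, a minimal boto3 sketch like the one below can confirm that your credentials are allowed to list tables. It assumes credentials are already available via your environment, an AWS profile, or an IAM role, and the region shown is illustrative:

```python
import boto3
from botocore.exceptions import ClientError

# Assumes credentials are resolved from the environment, a profile, or an IAM role;
# the region shown is illustrative.
client = boto3.client("dynamodb", region_name="us-east-2")

try:
    # Succeeds only if the attached policy allows dynamodb:ListTables.
    print(client.list_tables()["TableNames"])
except ClientError as err:
    print(f"ListTables failed: {err.response['Error']['Code']}")
```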
**awsAccessKeyId**: Enter your secure access key ID for your DynamoDB connection. The specified key ID should be authorized to read all databases you want to include in the metadata ingestion workflow.
{% /codeInfo %}
{% codeInfo srNumber=2 %}
**awsSecretAccessKey**: Enter the Secret Access Key (the passcode key pair to the key ID from above).
{% /codeInfo %}
{% codeInfo srNumber=3 %}
**awsSessionToken**: The AWS session token is an optional parameter. If you want, enter the details of your temporary session token.
{% /codeInfo %}
{% codeInfo srNumber=4 %}
**awsRegion**: Enter the AWS region that your data and account are associated with.
{% /codeInfo %}
{% codeInfo srNumber=5 %}
**endPointURL**: Your DynamoDB connector will automatically determine the AWS DynamoDB endpoint URL based on the region. You may override this behavior by specifying a value for the endpoint URL.
{% /codeInfo %}
{% codeInfo srNumber=6 %}
**databaseName**: Optional name to give to the database in OpenMetadata. If left blank, `default` will be used as the database name.
{% /codeInfo %}
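For reference, the snippet below is a rough sketch of how connection fields like these map onto standard boto3 client arguments; the placeholder values are illustrative, and the connector's internal wiring may differ:

```python
import boto3

# Placeholder values; substitute the parameters described above.
dynamodb = boto3.client(
    "dynamodb",
    aws_access_key_id="<awsAccessKeyId>",
    aws_secret_access_key="<awsSecretAccessKey>",
    aws_session_token="<awsSessionToken>",  # optional, only needed for temporary credentials
    region_name="<awsRegion>",
    # endpoint_url is optional; set it only to override the default regional endpoint,
    # e.g. "http://localhost:8000" for a local DynamoDB instance.
    endpoint_url=None,
)
```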
#### Source Configuration - Source Config
{% codeInfo srNumber=9 %}
The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/databaseServiceMetadataPipeline.json):
**markDeletedTables**: Set this property to flag tables as soft-deleted if they are no longer present in the source system.
**includeTables**: true or false, to ingest table data. Default is true.
**includeViews**: true or false, to ingest view definitions.
**databaseFilterPattern**, **schemaFilterPattern**, **tableFilterPattern**: Note that these filters support regex as include or exclude. You can find examples [here](/connectors/ingestion/workflows/metadata/filter-patterns/database)
**Connection Options (Optional)**: Enter the details for any additional connection options that can be sent to DynamoDB during the connection. These details must be added as Key-Value pairs.
{% /codeInfo %}
{% codeInfo srNumber=8 %}
**Connection Arguments (Optional)**: Enter the details for any additional connection arguments such as security or protocol configs that can be sent to DynamoDB during the connection. These details must be added as Key-Value pairs.
- In case you are using Single-Sign-On (SSO) for authentication, add the `authenticator` details in the Connection Arguments as a Key-Value pair as follows: `"authenticator" : "sso_login_url"`
{% /codeInfo %}
Create a Python file in your Airflow DAGs directory with the following contents:
{% codePreview %}
{% codeInfoContainer %}
{% codeInfo srNumber=12 %}
#### Import necessary modules
The `Workflow` class that is being imported is a part of a metadata ingestion framework, which defines a process of getting data from different sources and ingesting it into a central metadata repository.
Here we are also importing all the basic requirements to parse YAMLs, handle dates and build our DAG.
{% /codeInfo %}
{% codeInfo srNumber=13 %}
**Default arguments for all tasks in the Airflow DAG.**
- The default arguments dictionary contains the default arguments for all tasks in the DAG, including the owner's name, email address, number of retries, retry delay, and execution timeout.
{% /codeInfo %}
{% codeInfo srNumber=14 %}
- **config**: Specifies the config for the metadata ingestion, as prepared above.
{% /codeInfo %}
{% codeInfo srNumber=15 %}
- **metadata_ingestion_workflow()**: This code defines a function `metadata_ingestion_workflow()` that loads a YAML configuration, creates a `Workflow` object, executes the workflow, checks its status, prints the status to the console, and stops the workflow.
{% /codeInfo %}
{% codeInfo srNumber=16 %}
- **DAG**: Creates a DAG using the Airflow framework; tune the DAG configuration to whatever fits your requirements.
- For more Airflow DAGs creation details visit [here](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dags.html#declaring-a-dag).
{% /codeInfo %}
Note that from connector to connector, this recipe will always be the same.
By updating the `YAML configuration`, you will be able to extract metadata from different sources.
{% /codeInfoContainer %}
{% codeBlock fileName="filename.py" %}
```python {% srNumber=12 %}
import pathlib
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.config.common import load_config_file
from metadata.ingestion.api.workflow import Workflow
from airflow.utils.dates import days_ago
try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator
```
```python {% srNumber=13 %}
default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60)
}
```
```python {% srNumber=14 %}
config = """
<yourYAMLconfiguration>
"""
```
```python {% srNumber=15 %}
def metadata_ingestion_workflow():
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()
```
```python {% srNumber=16 %}
with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval='*/5 * * * *',
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```
{% /codeBlock %}
{% /codePreview %}
## dbt Integration
{% tilesContainer %}
{% tile
icon="mediation"
title="dbt Integration"
description="Learn more about how to ingest dbt models' definitions and their lineage."
link="/connectors/ingestion/workflows/dbt" /%}
{% /tilesContainer %}
## Related
{% tilesContainer %}
{% tile
title="Ingest with the CLI"
description="Run a one-time ingestion using the metadata CLI"