---
title: Run the ADLS Datalake Connector Externally
slug: /connectors/database/adls-datalake/yaml
---
{% connectorDetailsHeader
name="ADLS Datalake"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "Data Profiler", "Data Quality", "Sample Data"]
unavailableFeatures=["Query Usage", "Lineage", "Column-level Lineage", "Owners", "dbt", "Tags", "Stored Procedures"]
/ %}
In this section, we provide guides and references to use the ADLS Datalake connector.
Configure and schedule ADLS Datalake metadata and profiler workflows from the OpenMetadata UI:
- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
- [dbt Integration](#dbt-integration)
{% partial file="/v1.7/connectors/external-ingestion-deployment.md" /%}
## Requirements
**Note:** The ADLS Datalake connector supports extracting metadata from the `JSON`, `CSV`, `TSV`, and `Parquet` file types.
### ADLS Permissions
To extract metadata from Azure ADLS (Storage Account - StorageV2), you will need an **App Registration** with the following
roles granted on the Storage Account (a sample Azure CLI sketch follows the list):
- Storage Blob Data Reader
- Storage Queue Data Reader
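
As an illustration, assuming you already have an App Registration, these roles could be granted with the Azure CLI. All IDs and names below are placeholders; substitute your own:

```bash
# Scope the assignments to the Storage Account holding your data lake.
# All IDs and names are placeholders — replace them with your own.
SCOPE="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

az role assignment create \
  --assignee "<app-registration-client-id>" \
  --role "Storage Blob Data Reader" \
  --scope "$SCOPE"

az role assignment create \
  --assignee "<app-registration-client-id>" \
  --role "Storage Queue Data Reader" \
  --scope "$SCOPE"
```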
### Python Requirements
{% partial file="/v1.7/connectors/python-requirements.md" /%}
#### Azure installation
```bash
pip3 install "openmetadata-ingestion[datalake-azure]"
```
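
After installation, you can sanity-check that the `metadata` CLI shipped with the package is available (exact output varies by version):

```bash
metadata --version
```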
## Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to ADLS Datalake.
To create and run a Metadata Ingestion workflow, we will follow the steps to build a YAML configuration that connects to the source, processes the entities if needed, and reaches the OpenMetadata server.
The workflow is modeled around the following JSON Schema.
## 1. Define the YAML Config
### This is a sample config for Datalake using Azure:
{% codePreview %}
{% codeInfoContainer %}
#### Source Configuration - Service Connection
{% partial file="/v1.7/connectors/yaml/common/azure-config-def.md" /%}
{% partial file="/v1.7/connectors/yaml/database/source-config-def.md" /%}
{% partial file="/v1.7/connectors/yaml/ingestion-sink-def.md" /%}
{% partial file="/v1.7/connectors/yaml/workflow-config-def.md" /%}
{% /codeInfoContainer %}
{% codeBlock fileName="filename.yaml" %}
```yaml {% isCodeBlock=true %}
# Datalake with Azure
source:
  type: datalake
  serviceName: local_datalake
  serviceConnection:
    config:
      type: Datalake
      configSource:
        securityConfig:
```
{% partial file="/v1.7/connectors/yaml/common/azure-config.md" /%}
```yaml {% srNumber=9 %}
      prefix: prefix
```
{% partial file="/v1.7/connectors/yaml/database/source-config.md" /%}
{% partial file="/v1.7/connectors/yaml/ingestion-sink.md" /%}
{% partial file="/v1.7/connectors/yaml/workflow-config.md" /%}
{% /codeBlock %}
{% /codePreview %}
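
For reference, a fully assembled configuration might look like the sketch below. The `securityConfig` fields shown assume client-secret authentication; other Azure credential options are covered in the sections above, and all values are placeholders:

```yaml
# Minimal assembled sketch — every value below is a placeholder.
source:
  type: datalake
  serviceName: local_datalake
  serviceConnection:
    config:
      type: Datalake
      configSource:
        securityConfig:
          clientId: <client-id>
          clientSecret: <client-secret>
          tenantId: <tenant-id>
          accountName: <storage-account-name>
      prefix: prefix
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: <jwt-token>
```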
{% partial file="/v1.7/connectors/yaml/ingestion-cli.md" /%}
## dbt Integration
You can learn more about how to ingest dbt models' definitions and their lineage [here](/connectors/ingestion/workflows/dbt).