title | slug
---|---
Run PowerBI Connector using Airflow SDK | /connectors/dashboard/powerbi/airflow
Run PowerBI using the Airflow SDK
In this section, we provide guides and references to use the PowerBI connector.
Configure and schedule PowerBI metadata and profiler workflows using the Airflow SDK:
Requirements
To deploy OpenMetadata, check the Deployment guides. To run the ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.
Python Requirements
To run the PowerBI ingestion, you will need to install:
```shell
pip3 install "openmetadata-ingestion[powerbi]"
```
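To confirm the package is visible to the environment Airflow runs on, a quick check is to import the same `Workflow` class the DAG below uses (this is just a sanity check, not an official verification step):

```python
# Run in the same Python environment as Airflow: if this import fails,
# openmetadata-ingestion (with the powerbi extra) is not installed
# where Airflow can see it.
from metadata.ingestion.api.workflow import Workflow

print("openmetadata-ingestion is importable:", Workflow.__name__)
```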
Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to PowerBI.
To create and run a Metadata Ingestion workflow, we will follow these steps: create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following JSON Schema.
1. Define the YAML Config
This is a sample config for PowerBI:
```yaml
source:
  type: powerbi
  serviceName: local_powerbi
  serviceConnection:
    config:
      clientId: clientId
      clientSecret: secret
      tenantId: tenant
      # scope:
      #   - https://analysis.windows.net/powerbi/api/.default (default)
      # authorityURI: https://login.microsoftonline.com/ (default)
      # hostPort: https://analysis.windows.net/powerbi (default)
      # pagination_entity_per_page: 100 (default)
      # useAdminApis: true or false
      type: PowerBI
  sourceConfig:
    config:
      type: DashboardMetadata
      overrideOwner: True
      markDeletedDashboards: True
      includeTags: True
      # dbServiceNames:
      #   - service1
      #   - service2
      # dashboardFilterPattern:
      #   includes:
      #     - dashboard1
      #     - dashboard2
      #   excludes:
      #     - dashboard3
      #     - dashboard4
      # chartFilterPattern:
      #   includes:
      #     - chart1
      #     - chart2
      #   excludes:
      #     - chart3
      #     - chart4
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  # loggerLevel: DEBUG  # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: <OpenMetadata host and port>
    authProvider: <OpenMetadata auth provider>
```
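Before deploying the DAG, you can dry-run this YAML outside Airflow using the same workflow entrypoints the DAG below uses. `powerbi.yaml` is an assumed local filename for the configuration above:

```python
import yaml

from metadata.ingestion.api.workflow import Workflow

# Load the YAML shown above; "powerbi.yaml" is an assumed local path.
with open("powerbi.yaml") as f:
    workflow_config = yaml.safe_load(f)

# Same lifecycle the Airflow task below runs.
workflow = Workflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
workflow.stop()
```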
Source Configuration - Service Connection
- hostPort: URL to the PowerBI instance.
- clientId: PowerBI Client ID.
- clientSecret: PowerBI Client Secret.
- tenantId: PowerBI Tenant ID.
- authorityURI: Authority URI for the service.
- scope: Service scope. By default ["https://analysis.windows.net/powerbi/api/.default"].
- pagination_entity_per_page: The entity limit used to paginate the PowerBI APIs. The PowerBI API does not accept more than 100 workspaces at a time, so the default is 100 (see the sketch after this list).
- useAdminApis: Option for using the PowerBI Admin APIs:
  - Enabled (Use PowerBI Admin APIs): the admin APIs fetch dashboard and chart metadata from all the workspaces available in the PowerBI instance.
  - Disabled (Use Non-Admin PowerBI APIs): the non-admin APIs only fetch dashboard and chart metadata from the workspaces that have the security group of the service principal assigned to them.
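As an illustration of the pagination setting, here is a sketch of how an entities-per-page limit turns a long workspace list into batched API calls. The connector handles this internally; the names below are made up:

```python
# Illustrative only: batch a list of workspaces into pages of at most
# `per_page` items, mirroring what pagination_entity_per_page controls.
def paginate(items, per_page=100):
    for i in range(0, len(items), per_page):
        yield items[i : i + per_page]

workspace_ids = [f"workspace-{n}" for n in range(250)]
print([len(page) for page in paginate(workspace_ids)])  # [100, 100, 50]
```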
Source Configuration - Source Config
The `sourceConfig` is defined here:

- dbServiceNames: Database Service Names for the creation of lineage, if the source supports it.
- dashboardFilterPattern / chartFilterPattern: Note that both support regex as include or exclude. E.g., "My dashboard, My dash.*, .*Dashboard". See the sketch after the example below.
- includeTags: Set the Include Tags toggle to control whether to include tags as part of metadata ingestion.
- markDeletedDashboards: Set the Mark Deleted Dashboards toggle to flag dashboards as soft-deleted if they are no longer present in the source system.
```yaml
dashboardFilterPattern:
  includes:
    - users
    - type_test
```
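To make the filter semantics concrete, here is a minimal sketch (not the connector's internal code; its matching rules may differ in detail) of how include/exclude regex lists are typically applied to a dashboard name:

```python
import re

# Minimal sketch of include/exclude filtering with regex patterns.
def is_included(name, includes=None, excludes=None):
    if excludes and any(re.match(p, name) for p in excludes):
        return False
    if includes:
        return any(re.match(p, name) for p in includes)
    return True

print(is_included("users", includes=["users", "type_test"]))  # True
print(is_included("sales", includes=["users", "type_test"]))  # False
```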
Sink Configuration
To send the metadata to OpenMetadata, it needs to be specified as `type: metadata-rest`.
Workflow Configuration
The main property here is the `openMetadataServerConfig`, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```
We support different security providers. You can find their definitions here. You can find the different implementations of the ingestion below.
OpenMetadata JWT Auth
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'
```
Auth0 SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
Azure SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: azure
    securityConfig:
      clientSecret: '{your_client_secret}'
      authority: '{your_authority_url}'
      clientId: '{your_client_id}'
      scopes:
        - your_scopes
```
Custom OIDC SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
Google SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: google
    securityConfig:
      secretKey: '{path-to-json-creds}'
```
Okta SSO
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: okta
    securityConfig:
      clientId: "{CLIENT_ID - SPA APP}"
      orgURL: "{ISSUER_URL}/v1/token"
      privateKey: "{public/private keypair}"
      email: "{email}"
      scopes:
        - token
```
Amazon Cognito SSO
The ingestion can be configured by enabling JWT tokens:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: auth0
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
OneLogin SSO
OneLogin uses Custom OIDC for the ingestion:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
KeyCloak SSO
KeyCloak uses Custom OIDC for the ingestion:
```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: custom-oidc
    securityConfig:
      clientId: '{your_client_id}'
      secretKey: '{your_client_secret}'
      domain: '{your_domain}'
```
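Whichever provider you use, avoid hard-coding secrets in the DAG file. A sketch of injecting the token from an environment variable follows; `OM_JWT_TOKEN` is an assumed variable name, not an OpenMetadata convention:

```python
import os

import yaml

# Sketch: build the workflowConfig with the secret pulled from the
# environment. OM_JWT_TOKEN is an assumed variable name.
config_template = """
workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{token}'
"""

workflow_config = yaml.safe_load(
    config_template.format(token=os.environ["OM_JWT_TOKEN"])
)
```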
2. Prepare the Ingestion DAG
Create a Python file in your Airflow DAGs directory with the following contents:
```python
import yaml
from datetime import timedelta

from airflow import DAG

# PythonOperator moved modules in Airflow 2; fall back for Airflow 1.x.
try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator

from airflow.utils.dates import days_ago

from metadata.ingestion.api.workflow import Workflow

default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

config = """
<your YAML configuration>
"""


def metadata_ingestion_workflow():
    # Parse the YAML recipe and run the full ingestion lifecycle.
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="*/5 * * * *",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```
Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.
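Since the recipe is connector-agnostic, one common pattern is a small factory that builds one DAG per YAML recipe. This is a sketch reusing the imports, `default_args`, and `config` from the file above; `build_ingestion_dag` is a hypothetical helper, not an OpenMetadata utility:

```python
# Hypothetical factory: one ingestion DAG per YAML recipe. DAG ids and
# schedules here are illustrative.
def build_ingestion_dag(dag_id, config_str):
    def _run():
        workflow_config = yaml.safe_load(config_str)
        workflow = Workflow.create(workflow_config)
        workflow.execute()
        workflow.raise_from_status()
        workflow.print_status()
        workflow.stop()

    with DAG(
        dag_id,
        default_args=default_args,
        start_date=days_ago(1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="ingest", python_callable=_run)
    return dag


# Module-level DAG objects are what Airflow discovers.
powerbi_dag = build_ingestion_dag("powerbi_ingestion", config)
```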