---
description: >-
  Below is a sample code example you can refer to for integrating OpenMetadata
  ingestion with Airflow.
---
# Run Metadata Ingestion

## Airflow Example for Sample Data
```python
import json
import pathlib
from datetime import timedelta

from airflow import DAG

try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    # Fallback for older Airflow versions (pre-2.0)
    from airflow.operators.python_operator import PythonOperator

from airflow.utils.dates import days_ago

from metadata.config.common import load_config_file
from metadata.ingestion.api.workflow import Workflow

default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

config = """
{
  "source": {
    "type": "sample-data",
    "config": {
      "sample_data_folder": "./examples/sample_data"
    }
  },
  "sink": {
    "type": "metadata-rest",
    "config": {}
  },
  "metadata_server": {
    "type": "metadata-server",
    "config": {
      "api_endpoint": "http://localhost:8585/api",
      "auth_provider_type": "no-auth"
    }
  }
}
"""


def metadata_ingestion_workflow():
    # Build and run the ingestion workflow from the JSON configuration above
    workflow_config = json.loads(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="*/5 * * * *",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```
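The `pathlib` and `load_config_file` imports are unused in the inline example above; they come into play if you keep the workflow configuration in a separate file rather than an inline string. Below is a minimal sketch of that variant, assuming `load_config_file` accepts a `pathlib.Path` to a JSON config file (the path shown is hypothetical):

```python
def metadata_ingestion_workflow():
    # Hypothetical path: point this at your own workflow config file
    config_file = pathlib.Path("examples/workflows/sample_data.json")
    workflow_config = load_config_file(config_file)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()
```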
Inside the DAG, the `PythonOperator` invokes a Python method like the one below:
```python
def metadata_ingestion_workflow():
    workflow_config = json.loads(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()
```
This method creates a `Workflow` instance from the sample-data configuration, which reads metadata from JSON files and ingests it into the OpenMetadata server. To customize this configuration or use a different connector, please refer to our examples and the Connectors section.
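To point the same DAG at a live source instead of the sample data, you only need to swap the `source` block of the configuration. Here is a hedged sketch assuming a MySQL connector; the exact field names vary by connector and OpenMetadata version, so check the Connectors documentation for the schema your release expects:

```python
# Hypothetical example only: connector field names depend on your version.
config = """
{
  "source": {
    "type": "mysql",
    "config": {
      "host_port": "localhost:3306",
      "username": "openmetadata_user",
      "password": "openmetadata_password",
      "service_name": "local_mysql"
    }
  },
  "sink": {
    "type": "metadata-rest",
    "config": {}
  },
  "metadata_server": {
    "type": "metadata-server",
    "config": {
      "api_endpoint": "http://localhost:8585/api",
      "auth_provider_type": "no-auth"
    }
  }
}
"""
```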