Docs: Links corrected (#14459)

Shilpa Vernekar 2023-12-20 18:20:33 +05:30 committed by GitHub
parent 4ebe34363f
commit 1dc79bfd3c
62 changed files with 179 additions and 148 deletions

@@ -784,9 +784,10 @@ You can learn more about how to ingest lineage [here](/connectors/ingestion/work
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/athena/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -499,9 +499,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/azuresql/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -502,9 +502,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/db2/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -282,9 +282,10 @@ you will be able to extract metadata from different sources.
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/deltalake/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -272,9 +272,10 @@ you will be able to extract metadata from different sources.
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/domo-database/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -287,9 +287,10 @@ you will be able to extract metadata from different sources.
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/dynamodb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -521,9 +521,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/hive/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -480,9 +480,10 @@ link="/connectors/ingestion/workflows/dbt" /%}
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/impala/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -27,8 +27,8 @@ This is the supported list of connectors for Database Services:
- [Postgres](/connectors/database/postgres)
- [Presto](/connectors/database/presto)
- [Redshift](/connectors/database/redshift)
-- [Sap Hana](/connectors/database/saphana)
- [Salesforce](/connectors/database/salesforce)
+- [Sap Hana](/connectors/database/sap-hana)
- [SingleStore](/connectors/database/singlestore)
- [Snowflake](/connectors/database/snowflake)
- [Trino](/connectors/database/trino)

@@ -482,9 +482,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mariadb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -233,9 +233,10 @@ you will be able to extract metadata from different sources.
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mongodb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -605,9 +605,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mysql/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -526,9 +526,10 @@ You can learn more about how to ingest lineage [here](/connectors/ingestion/work
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/oracle/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -524,9 +524,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/pinotdb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -484,9 +484,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/presto/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -257,9 +257,10 @@ you will be able to extract metadata from different sources.
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/salesforce/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -513,9 +513,10 @@ Note how instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/sap-hana/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -476,9 +476,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/singlestore/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -559,9 +559,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/trino/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -521,9 +521,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/vertica/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -52,6 +52,7 @@ the following docs to run the Ingestion Framework in any orchestrator externally
- [PinotDB](/connectors/database/pinotdb)
- [Redshift](/connectors/database/redshift)
- [Salesforce](/connectors/database/salesforce)
+- [Sap Hana](/connectors/database/sap-hana)
- [SingleStore](/connectors/database/singlestore)
- [Snowflake](/connectors/database/snowflake)
- [SQLite](/connectors/database/sqlite)

@@ -62,8 +62,8 @@ While the endpoints are directly defined in the `IngestionPipelineResource`, the
that decouples how OpenMetadata communicates with the Orchestrator, as different external systems will need different
calls and data to be sent.
-- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/util/PipelineServiceClient.java),
-- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/airflow/AirflowRESTClient.java).
+- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/java/org/openmetadata/sdk/PipelineServiceClient.java),
+- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/clients/pipeline/airflow/AirflowRESTClient.java).
The clients that implement the abstractions from the `PipelineServiceClient` are merely a translation layer between the
information received in the shape of an `IngestionPipeline` Entity, and the specific requirements of each Orchestrator.

@@ -119,12 +119,12 @@ as well). You might also need to validate if the query logs are available in the
You can check the queries being used here:
-- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L428)
-- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L197)
-- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L350)
-- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L18)
-- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L376)
-- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L467)
+- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/bigquery/queries.py)
+- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/snowflake/queries.py)
+- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/mssql/queries.py)
+- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/redshift/queries.py)
+- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/clickhouse/queries.py)
+- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/postgres/queries.py)
By default, we apply a result limit of 1000 records. You might also need to increase that for databases with big volumes
of queries.
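One note on that default: the 1000-record cap and the query-log window are normally raised in the usage workflow YAML rather than in the query modules linked above. A minimal sketch of the relevant `source` block, assuming the standard `DatabaseUsage` source config keys (`queryLogDuration`, `resultLimit`) and a hypothetical Snowflake service; the rest of the workflow file (sink, workflowConfig) stays as in the connector guides:

```yaml
source:
  type: snowflake-usage          # usage variant of the connector
  serviceName: my_snowflake      # hypothetical service name
  sourceConfig:
    config:
      type: DatabaseUsage
      queryLogDuration: 1        # days of query history to scan
      resultLimit: 5000          # raise the default 1000-record cap for busy warehouses
```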

@@ -41,7 +41,7 @@ the steps to create a YAML configuration able to connect to the source,
process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
### 1. Define the YAML Config

@@ -56,7 +56,7 @@ pip3 install "openmetadata-ingestion[sagemaker]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
-[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/mlmodel/sagemakerConnection.json)
+[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/mlmodel/sageMakerConnection.json)
you can find the structure to create a connection to Sagemaker.
In order to create and run a Metadata Ingestion workflow, we will follow
@@ -64,7 +64,7 @@ the steps to create a YAML configuration able to connect to the source,
process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
### 1. Define the YAML Config

@@ -338,9 +338,10 @@ you will be able to extract metadata from different sources.
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/athena/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -237,7 +237,7 @@ installation.
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -367,7 +367,7 @@ installation.
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -11,7 +11,7 @@ OpenMetadata has support for Google SSO, Okta SSO, custom OIDC, Auth0, Azure SSO
Enabling Security is only required for your **Production** installation. If you are testing OpenMetadata, it will be easier
and faster to set up without security. To get up and running quickly with OpenMetadata (without security),
-please follow the [Quickstart](/quick-start/local-deployment) guide.
+please follow the [Quickstart](/quick-start) guide.
{%inlineCalloutContainer%}
{%inlineCallout

@@ -66,5 +66,5 @@ airflowConfiguration:
metadataApiEndpoint: ${SERVER_HOST_API_URL:-http://localhost:8585/api}
```
-**Note:** Follow [this](/how-to-guides/feature-configurations/bots) guide to configure the `ingestion-bot` credentials for
+**Note:** Follow [this](/how-to-guides/quick-start-guide-for-admins/bots) guide to configure the `ingestion-bot` credentials for
ingesting data from Airflow.

@@ -74,7 +74,7 @@ AUTHENTICATION_CLIENT_ID={CLIENT_ID - SPA APP} # Update with your Client ID
AUTHENTICATION_CALLBACK_URL=http://localhost:8585/callback
```
-**Note:** Follow [this](/how-to-guides/feature-configurations/bots) guide to configure the `ingestion-bot` credentials for
+**Note:** Follow [this](/how-to-guides/quick-start-guide-for-admins/bots) guide to configure the `ingestion-bot` credentials for
ingesting data from Airflow.
## 2. Start Docker

@@ -254,7 +254,7 @@ installation.
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -187,7 +187,7 @@ For more information, visit the kubectl logs command line reference documentatio
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -34,7 +34,7 @@ alt="tour" /%}
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -41,6 +41,8 @@ Configure and schedule Unity Catalog metadata workflow from the OpenMetadata UI:
{% partial file="/v1.2/connectors/external-ingestion-deployment.md" /%}
## Requirements
{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
{%/inlineCallout%}

@@ -33,7 +33,7 @@ Configure and schedule Athena metadata and profiler workflows from the OpenMetad
- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
- [Service Name](#service-name)
-- [Connection Options](#connection-options)
+- [Connection Details](#connection-details)
- [Metadata Ingestion Options](#metadata-ingestion-options)
- [Troubleshooting](#troubleshooting)
- [Workflow Deployment Error](#workflow-deployment-error)

@@ -207,9 +207,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/azuresql/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -47,6 +47,8 @@ Configure and schedule BigQuery metadata and profiler workflows from the OpenMet
To deploy OpenMetadata, check the Deployment guides.
{%/inlineCallout%}
## Requirements
### Data Catalog API Permissions
- Go to [https://console.cloud.google.com/apis/library/datacatalog.googleapis.com](https://console.cloud.google.com/apis/library/datacatalog.googleapis.com)

@@ -98,6 +98,7 @@ link="/connectors/database/bigquery/roles"
/ %}
{% /tilesContainer %}
## Metadata Ingestion
### 1. Define the YAML Config

@@ -156,9 +156,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mongodb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -30,7 +30,6 @@ In this section, we provide guides and references to use the Databricks connecto
Configure and schedule Databricks metadata and profiler workflows from the OpenMetadata UI:
- [Requirements](#requirements)
-- [Unity Catalog](#unity-catalog)
- [Metadata Ingestion](#metadata-ingestion)
- [Query Usage](/connectors/ingestion/workflows/usage)

@@ -31,9 +31,9 @@ In this section, we provide guides and references to use the Impala connector.
Configure and schedule Impala metadata and profiler workflows from the OpenMetadata UI:
- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
-- [Data Profiler](#data-profiler)
-- [Data Quality](#data-quality)
-- [dbt Integration](#dbt-integration)
+- [Data Profiler](/connectors/ingestion/workflows/profiler)
+- [Data Quality](/connectors/ingestion/workflows/data-quality)
+- [dbt Integration](/connectors/ingestion/workflows/dbt)
{% partial file="/v1.3/connectors/ingestion-modes-tiles.md" variables={yamlPath: "/connectors/database/impala/yaml"} /%}

@@ -195,9 +195,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/presto/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -41,6 +41,8 @@ Configure and schedule Unity Catalog metadata workflow from the OpenMetadata UI:
{% partial file="/v1.3/connectors/external-ingestion-deployment.md" /%}
## Requirements
{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
{%/inlineCallout%}

@@ -33,9 +33,9 @@ Configure and schedule Vertica metadata and profiler workflows from the OpenMeta
- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
-- [Data Profiler](#data-profiler)
-- [Data Quality](#data-quality)
-- [dbt Integration](#dbt-integration)
+- [Data Profiler](/connectors/ingestion/workflows/profiler)
+- [Data Quality](/connectors/ingestion/workflows/data-quality)
+- [dbt Integration](/connectors/ingestion/workflows/dbt)
{% partial file="/v1.3/connectors/ingestion-modes-tiles.md" variables={yamlPath: "/connectors/database/vertica/yaml"} /%}

@@ -62,8 +62,8 @@ While the endpoints are directly defined in the `IngestionPipelineResource`, the
that decouples how OpenMetadata communicates with the Orchestrator, as different external systems will need different
calls and data to be sent.
-- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/util/PipelineServiceClient.java),
-- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/airflow/AirflowRESTClient.java).
+- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/java/org/openmetadata/sdk/PipelineServiceClient.java),
+- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/clients/pipeline/airflow/AirflowRESTClient.java).
The clients that implement the abstractions from the `PipelineServiceClient` are merely a translation layer between the
information received in the shape of an `IngestionPipeline` Entity, and the specific requirements of each Orchestrator.
@@ -284,7 +284,7 @@ pipelineServiceClient.deployPipeline(ingestionPipeline);
```
Then, the actual deployment logic is handled by the class implementing the Pipeline Service Client. For this example,
-it will be the [AirflowRESTClient](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/airflow/AirflowRESTClient.java).
+it will be the [AirflowRESTClient](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/clients/pipeline/airflow/AirflowRESTClient.java).
First, let's see what it is needed to instantiate the Airflow REST Client:

@@ -119,12 +119,12 @@ as well). You might also need to validate if the query logs are available in the
You can check the queries being used here:
-- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L428)
-- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L197)
-- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L350)
-- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L18)
-- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L376)
-- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L467)
+- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/bigquery/queries.py)
+- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/snowflake/queries.py)
+- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/mssql/queries.py)
+- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/redshift/queries.py)
+- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/clickhouse/queries.py)
+- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/postgres/queries.py)
By default, we apply a result limit of 1000 records. You might also need to increase that for databases with big volumes
of queries.

@@ -41,7 +41,7 @@ the steps to create a YAML configuration able to connect to the source,
process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
### 1. Define the YAML Config

@@ -64,7 +64,7 @@ the steps to create a YAML configuration able to connect to the source,
process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
### 1. Define the YAML Config

@@ -14,7 +14,7 @@ Configure and schedule Dagster metadata and profiler workflows from the OpenMeta
- [Dagster Versions](#dagster-versions)
- [Metadata Ingestion](#metadata-ingestion)
- [Service Name](#service-name)
-- [Connection Options](#connection-options)
+- [Connection Details](#connection-details)
- [Metadata Ingestion Options](#metadata-ingestion-options)
- [Troubleshooting](#troubleshooting)
- [Workflow Deployment Error](#workflow-deployment-error)

@@ -299,9 +299,10 @@ source:
{% tilesContainer %}
{% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/athena/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
{% /tilesContainer %}

@@ -107,7 +107,7 @@ wget https://github.com/open-metadata/OpenMetadata/releases/download/1.2.2-relea
### 3. Update Environment Variables required for OpenMetadata Dependencies
-In the previous [step](#2-download-docker-compose-file-from-github-release-branch), we download the `docker-compose` file.
+In the previous [step](#2-download-docker-compose-file-from-github-releases), we download the `docker-compose` file.
Identify and update the environment variables in the file to prepare openmetadata configurations.
@@ -367,7 +367,7 @@ installation.
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -11,7 +11,7 @@ OpenMetadata has support for Google SSO, Okta SSO, custom OIDC, Auth0, Azure SSO
Enabling Security is only required for your **Production** installation. If you are testing OpenMetadata, it will be easier
and faster to set up without security. To get up and running quickly with OpenMetadata (without security),
-please follow the [Quickstart](/quick-start/local-deployment) guide.
+please follow the [Quickstart](/quick-start) guide.
{%inlineCalloutContainer%}
{%inlineCallout

@@ -67,5 +67,5 @@ airflowConfiguration:
metadataApiEndpoint: ${SERVER_HOST_API_URL:-http://localhost:8585/api}
```
-**Note:** Follow [this](/how-to-guides/feature-configurations/bots) guide to configure the `ingestion-bot` credentials for
+**Note:** Follow [this](/developers/bots) guide to configure the `ingestion-bot` credentials for
ingesting data from Airflow.

@@ -74,7 +74,7 @@ AUTHENTICATION_CLIENT_ID={CLIENT_ID - SPA APP} # Update with your Client ID
AUTHENTICATION_CALLBACK_URL=http://localhost:8585/callback
```
-**Note:** Follow [this](/how-to-guides/feature-configurations/bots) guide to configure the `ingestion-bot` credentials for
+**Note:** Follow [this](/developers/bots) guide to configure the `ingestion-bot` credentials for
ingesting data from Airflow.
## 2. Start Docker

@@ -108,7 +108,7 @@ openmetadata:
- global: Additional property airflow is not allowed
```
-This means the values passed to the helm charts has a section `global.airflow`. As per the breaking changes mentioned [here](/deployment/upgrade/versions/013-to-100#airflow-configuration-&-pipeline-service-client), Airflow configs are replaced with pipelineServiceClient for Helm Charts.
+This means the values passed to the helm charts has a section `global.airflow`. As per the breaking changes mentioned [here](/deployment/upgrade/versions/100-to-110#pipeline-service-client-configuration), Airflow configs are replaced with pipelineServiceClient for Helm Charts.
The Helm Chart Values JSON Schema helps to catch the above breaking changes and this section will help you resolve and update your configurations for the same. You can read more about JSON Schema with Helm Charts [here](https://helm.sh/docs/topics/charts/#schema-files).
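For readers hitting this error, a hedged before/after sketch of the values move follows; the exact keys shown (`global.airflow`, `openmetadata.config.pipelineServiceClientConfig.apiEndpoint`) are illustrative assumptions and should be checked against the linked upgrade guide and the chart's JSON Schema:

```yaml
# Before (0.13.x-style values) -- now rejected by the chart's JSON Schema
global:
  airflow:
    host: http://openmetadata-dependencies-web:8080

# After (1.x-style values) -- Airflow settings live under the pipeline service client
# (key names are illustrative; verify against the chart's values schema)
openmetadata:
  config:
    pipelineServiceClientConfig:
      enabled: true
      apiEndpoint: http://openmetadata-dependencies-web:8080
```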

@@ -172,6 +172,6 @@ Admin users can create, edit, or delete services. They can also view the connect
color="violet-70"
bold="Delete a Service Connection"
icon="MdArrowForward"
-href="/how-to-guides/admin-guide/how-to-ingest-metadata/delete-service-connection"%}
+href="/how-to-guides/admin-guide/delete-service-connection"%}
Permanently delete a service connection.
{%/inlineCallout%}

@@ -87,7 +87,7 @@ OpenMetadata is a complete package for data teams to break down team silos, shar
- Enhance organizational **[Data Culture](/how-to-guides/data-insights)** to gain crucial insights to drive innovation.
-- Define your **[Glossary](/how-to-guides/data-governance/glossary-classification)** to build a common understanding of terms within your organization.
+- Define your **[Glossary](/how-to-guides/data-governance/glossary)** to build a common understanding of terms within your organization.
- Implement **[Data Governance](/how-to-guides/data-governance)** to maintain data integrity, security, and compliance.

@@ -36,7 +36,7 @@ You can view all the tags in the right panel.
Data assets can also be classified using Tiers. Learn more about [Tiers](/how-to-guides/data-governance/classification/tiers).
-Among the Classification Tags, OpenMetadata has some System Classification. Learn more about the [System Tags](/how-to-guides/data-governance/classification/classification).
+Among the Classification Tags, OpenMetadata has some System Classification. Learn more about the [System Tags](/how-to-guides/data-governance/classification/overview#classification-in-openmetadata).
## Auto-Classification in OpenMetadata

@@ -254,7 +254,7 @@ installation.
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -187,7 +187,7 @@ For more information, visit the kubectl logs command line reference documentatio
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -34,7 +34,7 @@ alt="tour" /%}
## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
OpenMetadata.
3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.