DOCS: Broken links in v1.2 corrected (#14410)

Shilpa Vernekar 2023-12-15 22:46:00 +05:30 committed by GitHub
parent d2bc681015
commit 3d85d4734d
23 changed files with 46 additions and 41 deletions

@@ -33,7 +33,7 @@ Configure and schedule Athena metadata and profiler workflows from the OpenMetad
 - [Requirements](#requirements)
 - [Metadata Ingestion](#metadata-ingestion)
 - [Service Name](#service-name)
-- [Connection Options](#connection-options)
+- [Connection Details](#connection-details)
 - [Metadata Ingestion Options](#metadata-ingestion-options)
 - [Troubleshooting](#troubleshooting)
 - [Workflow Deployment Error](#workflow-deployment-error)

@@ -207,9 +207,10 @@ source:
 {% tilesContainer %}
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/azuresql/airflow"
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
 / %}
 {% /tilesContainer %}

@@ -47,6 +47,7 @@ Configure and schedule BigQuery metadata and profiler workflows from the OpenMet
 To deploy OpenMetadata, check the Deployment guides.
 {%/inlineCallout%}
 ## Requirements
+### Data Catalog API Permissions
 - Go to [https://console.cloud.google.com/apis/library/datacatalog.googleapis.com](https://console.cloud.google.com/apis/library/datacatalog.googleapis.com)

@@ -98,6 +98,7 @@ link="/connectors/database/bigquery/roles"
 / %}
 {% /tilesContainer %}
 ## Metadata Ingestion
+### 1. Define the YAML Config

@@ -156,9 +156,10 @@ source:
 {% tilesContainer %}
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mongodb/airflow"
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
 / %}
 {% /tilesContainer %}

@@ -30,7 +30,6 @@ In this section, we provide guides and references to use the Databricks connecto
 Configure and schedule Databricks metadata and profiler workflows from the OpenMetadata UI:
 - [Requirements](#requirements)
-- [Unity Catalog](#unity-catalog)
 - [Metadata Ingestion](#metadata-ingestion)
 - [Query Usage](/connectors/ingestion/workflows/usage)

@@ -195,9 +195,10 @@ source:
 {% tilesContainer %}
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/presto/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 {% /tilesContainer %}

@@ -62,8 +62,8 @@ While the endpoints are directly defined in the `IngestionPipelineResource`, the
 that decouples how OpenMetadata communicates with the Orchestrator, as different external systems will need different
 calls and data to be sent.
-- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/util/PipelineServiceClient.java),
-- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/airflow/AirflowRESTClient.java).
+- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/java/org/openmetadata/sdk/PipelineServiceClient.java),
+- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/clients/pipeline/airflow/AirflowRESTClient.java).
 The clients that implement the abstractions from the `PipelineServiceClient` are merely a translation layer between the
 information received in the shape of an `IngestionPipeline` Entity, and the specific requirements of each Orchestrator.
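The translation-layer idea in this hunk can be sketched in a few lines of Python. Everything below is illustrative: the real abstraction is the Java `PipelineServiceClient` linked above, and the class and method names here are hypothetical stand-ins, not the project's API.

```python
from abc import ABC, abstractmethod


class IngestionPipeline:
    """Toy stand-in for the OpenMetadata IngestionPipeline entity."""

    def __init__(self, name: str, pipeline_type: str):
        self.name = name
        self.pipeline_type = pipeline_type


class PipelineServiceClient(ABC):
    """Translation layer between an IngestionPipeline and an orchestrator."""

    @abstractmethod
    def deploy_pipeline(self, pipeline: IngestionPipeline) -> str:
        """Translate the entity into orchestrator-specific calls."""


class InMemoryOrchestratorClient(PipelineServiceClient):
    """Fake orchestrator client that just records deployments."""

    def __init__(self):
        self.deployed = {}

    def deploy_pipeline(self, pipeline: IngestionPipeline) -> str:
        # A real implementation (e.g. the Airflow one) would POST a payload
        # to the orchestrator's REST API here; we only store it in memory.
        self.deployed[pipeline.name] = pipeline.pipeline_type
        return f"deployed {pipeline.name}"


client = InMemoryOrchestratorClient()
print(client.deploy_pipeline(IngestionPipeline("athena_metadata", "metadata")))
```

The server code only ever talks to the abstract interface, so swapping orchestrators means swapping the concrete client, not the endpoints.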
@@ -284,7 +284,7 @@ pipelineServiceClient.deployPipeline(ingestionPipeline);
 ```
 Then, the actual deployment logic is handled by the class implementing the Pipeline Service Client. For this example,
-it will be the [AirflowRESTClient](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/airflow/AirflowRESTClient.java).
+it will be the [AirflowRESTClient](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/clients/pipeline/airflow/AirflowRESTClient.java).
 First, let's see what it is needed to instantiate the Airflow REST Client:

@@ -119,12 +119,12 @@ as well). You might also need to validate if the query logs are available in the
 You can check the queries being used here:
-- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L428)
-- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L197)
-- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L350)
-- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L18)
-- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L376)
-- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L467)
+- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/bigquery/queries.py)
+- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/snowflake/queries.py)
+- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/mssql/queries.py)
+- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/redshift/queries.py)
+- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/clickhouse/queries.py)
+- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/postgres/queries.py)
 By default, we apply a result limit of 1000 records. You might also need to increase that for databases with big volumes
 of queries.
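For reference, the 1000-record default mentioned in this hunk is tuned through the usage workflow's `sourceConfig`. A minimal sketch, assuming the standard `DatabaseUsage` pipeline fields (`queryLogDuration`, `resultLimit`); verify the exact keys against the JSON Schema for your OpenMetadata version:

```yaml
source:
  sourceConfig:
    config:
      type: DatabaseUsage
      # Days of query history to scan
      queryLogDuration: 1
      # Default is 1000; raise it for databases with a high query volume
      resultLimit: 5000
```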

@@ -41,7 +41,7 @@ the steps to create a YAML configuration able to connect to the source,
 process the Entities if needed, and reach the OpenMetadata server.
 The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
 ### 1. Define the YAML Config

@@ -56,7 +56,7 @@ pip3 install "openmetadata-ingestion[sagemaker]"
 ## Metadata Ingestion
 All connectors are defined as JSON Schemas.
-[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/mlmodel/sagemakerConnection.json)
+[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/mlmodel/sageMakerConnection.json)
 you can find the structure to create a connection to Sagemaker.
 In order to create and run a Metadata Ingestion workflow, we will follow
@@ -64,7 +64,7 @@ the steps to create a YAML configuration able to connect to the source,
 process the Entities if needed, and reach the OpenMetadata server.
 The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
 ### 1. Define the YAML Config

@@ -299,9 +299,10 @@ source:
 {% tilesContainer %}
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/athena/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 {% /tilesContainer %}

@@ -367,7 +367,7 @@ installation.
 ## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
 2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
 OpenMetadata.
 3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -11,7 +11,7 @@ OpenMetadata has support for Google SSO, Okta SSO, custom OIDC, Auth0, Azure SSO
 Enabling Security is only required for your **Production** installation. If you are testing OpenMetadata, it will be easier
 and faster to set up without security. To get up and running quickly with OpenMetadata (without security),
-please follow the [Quickstart](/quick-start/local-deployment) guide.
+please follow the [Quickstart](/quick-start) guide.
 {%inlineCalloutContainer%}
 {%inlineCallout

@@ -67,5 +67,5 @@ airflowConfiguration:
   metadataApiEndpoint: ${SERVER_HOST_API_URL:-http://localhost:8585/api}
 ```
-**Note:** Follow [this](/how-to-guides/feature-configurations/bots) guide to configure the `ingestion-bot` credentials for
+**Note:** Follow [this](/developers/bots) guide to configure the `ingestion-bot` credentials for
 ingesting data from Airflow.
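The `${SERVER_HOST_API_URL:-http://localhost:8585/api}` value in this hunk uses shell-style `${VAR:-default}` substitution: the environment variable wins when set and non-empty, otherwise the default applies. A rough Python equivalent of that lookup (illustrative only, not OpenMetadata code):

```python
import os


def resolve(var: str, default: str) -> str:
    """Mimic the ${VAR:-default} substitution used in the config above."""
    value = os.environ.get(var)
    # Like the shell `:-` form, an unset *or empty* variable falls back
    # to the default value.
    return value if value else default


# With SERVER_HOST_API_URL unset, the endpoint falls back to the default:
os.environ.pop("SERVER_HOST_API_URL", None)
print(resolve("SERVER_HOST_API_URL", "http://localhost:8585/api"))
```

Setting `SERVER_HOST_API_URL` in the container environment overrides the default without editing the YAML.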

@@ -74,7 +74,7 @@ AUTHENTICATION_CLIENT_ID={CLIENT_ID - SPA APP} # Update with your Client ID
 AUTHENTICATION_CALLBACK_URL=http://localhost:8585/callback
 ```
-**Note:** Follow [this](/how-to-guides/feature-configurations/bots) guide to configure the `ingestion-bot` credentials for
+**Note:** Follow [this](/developers/bots) guide to configure the `ingestion-bot` credentials for
 ingesting data from Airflow.
 ## 2. Start Docker

@@ -108,7 +108,7 @@ openmetadata:
 - global: Additional property airflow is not allowed
 ```
-This means the values passed to the helm charts has a section `global.airflow`. As per the breaking changes mentioned [here](/deployment/upgrade/versions/013-to-100#airflow-configuration-&-pipeline-service-client), Airflow configs are replaced with pipelineServiceClient for Helm Charts.
+This means the values passed to the helm charts has a section `global.airflow`. As per the breaking changes mentioned [here](/deployment/upgrade/versions/100-to-110#pipeline-service-client-configuration), Airflow configs are replaced with pipelineServiceClient for Helm Charts.
 The Helm Chart Values JSON Schema helps to catch the above breaking changes and this section will help you resolve and update your configurations for the same. You can read more about JSON Schema with Helm Charts [here](https://helm.sh/docs/topics/charts/#schema-files).

@@ -172,6 +172,6 @@ Admin users can create, edit, or delete services. They can also view the connect
 color="violet-70"
 bold="Delete a Service Connection"
 icon="MdArrowForward"
-href="/how-to-guides/admin-guide/how-to-ingest-metadata/delete-service-connection"%}
+href="/how-to-guides/admin-guide/delete-service-connection"%}
 Permanently delete a service connection.
 {%/inlineCallout%}

@@ -87,7 +87,7 @@ OpenMetadata is a complete package for data teams to break down team silos, shar
 - Enhance organizational **[Data Culture](/how-to-guides/data-insights)** to gain crucial insights to drive innovation.
-- Define your **[Glossary](/how-to-guides/data-governance/glossary-classification)** to build a common understanding of terms within your organization.
+- Define your **[Glossary](/how-to-guides/data-governance/glossary)** to build a common understanding of terms within your organization.
 - Implement **[Data Governance](/how-to-guides/data-governance)** to maintain data integrity, security, and compliance.

@@ -36,7 +36,7 @@ You can view all the tags in the right panel.
 Data assets can also be classified using Tiers. Learn more about [Tiers](/how-to-guides/data-governance/classification/tiers).
-Among the Classification Tags, OpenMetadata has some System Classification. Learn more about the [System Tags](/how-to-guides/data-governance/classification/classification).
+Among the Classification Tags, OpenMetadata has some System Classification. Learn more about the [System Tags](/how-to-guides/data-governance/classification/overview#classification-in-openmetadata).
 ## Auto-Classification in OpenMetadata

@@ -254,7 +254,7 @@ installation.
 ## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
 2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
 OpenMetadata.
 3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -187,7 +187,7 @@ For more information, visit the kubectl logs command line reference documentatio
 ## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
 2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
 OpenMetadata.
 3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.

@@ -34,7 +34,7 @@ alt="tour" /%}
 ## Next Steps
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
 2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
 OpenMetadata.
 3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.