Docs: Correcting GCS Composer to GCP Composer (#22429)

* Docs: Correcting GCS Composer to GCP Composer

* Docs: Correcting GCS Composer to GCP Composer

---------

Co-authored-by: Rounak <rounakpreet.d@deuexsolutions.com>
Author: Rounak Dhillon, 2025-07-17 18:23:04 +05:30 (committed by GitHub)
Parent: 049fa84330
Commit: 69bead8ee9
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)
24 changed files with 90 additions and 90 deletions

View File

@@ -14,8 +14,8 @@ site_menu:
url: /getting-started/day-1/hybrid-saas/airflow
- category: Getting Started / Day 1 / Hybrid SaaS / MWAA
url: /getting-started/day-1/hybrid-saas/mwaa
- category: Getting Started / Day 1 / Hybrid SaaS / GCS Composer
url: /getting-started/day-1/hybrid-saas/gcs-composer
- category: Getting Started / Day 1 / Hybrid SaaS / GCP Composer
url: /getting-started/day-1/hybrid-saas/gcp-composer
- category: Getting Started / Day 1 / Hybrid SaaS / GitHub Actions
url: /getting-started/day-1/hybrid-saas/github-actions
- category: Getting Started / Day 1 / Hybrid SaaS / Credentials
@@ -451,8 +451,8 @@ site_menu:
url: /connectors/pipeline/airflow/configuring-lineage
- category: Connectors / Pipeline / Airflow / MWAA
url: /connectors/pipeline/airflow/mwaa
- category: Connectors / Pipeline / Airflow / GCS Composer
url: /connectors/pipeline/airflow/gcs-composer
- category: Connectors / Pipeline / Airflow / GCP Composer
url: /connectors/pipeline/airflow/gcp-composer
- category: Connectors / Pipeline / Azure Data Factory
url: /connectors/pipeline/datafactory
- category: Connectors / Pipeline / Azure Data Factory / Run Externally

View File

@@ -1,9 +1,9 @@
---
title: Extract Metadata from GCS Composer
slug: /connectors/pipeline/airflow/gcs-composer
title: Extract Metadata from GCP Composer
slug: /connectors/pipeline/airflow/gcp-composer
---
# Extract Metadata from GCS Composer
# Extract Metadata from GCP Composer
## Requirements
@@ -20,7 +20,7 @@ Feel free to choose whatever approach adapts best to your current architecture a
## Using the Python Operator
The most comfortable way to extract metadata out of GCS Composer is by directly creating a DAG in there
The most comfortable way to extract metadata out of GCP Composer is by directly creating a DAG in there
that will handle the connection to the metadata database automatically and push the contents
to your OpenMetadata server.
@@ -129,7 +129,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -27,8 +27,8 @@ Configure and schedule Airflow metadata workflow from the OpenMetadata UI:
link="/deployment/ingestion/external/mwaa"
/ %}
{% tile
title="GCS Composer"
description="Run the ingestion from GCS Composer."
title="GCP Composer "
description="Run the ingestion from GCP Composer ."
link="/deployment/ingestion/external/gcs-composer"
/ %}
{% /tilesContainer %}

View File

@@ -1,13 +1,13 @@
---
title: Run the ingestion from GCS Composer | Official Documentation
description: Deploy external ingestion using GCS Composer to automate metadata and quality pipelines on Google Cloud environments.
slug: /deployment/ingestion/external/gcs-composer
title: Run the ingestion from GCP Composer | Official Documentation
description: Deploy external ingestion using GCP Composer to automate metadata and quality pipelines on Google Cloud environments.
slug: /deployment/ingestion/external/gcp-composer
collate: false
---
{% partial file="/v1.7/deployment/external-ingestion.md" /%}
# Run the ingestion from GCS Composer
# Run the ingestion from GCP Composer
## Requirements
@@ -19,7 +19,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1
## Using the Python Operator
The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
it will require you to install the packages and plugins directly on the host.
### Install the Requirements
@@ -98,7 +98,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -284,9 +284,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
{% inlineCallout
color="violet-70"
icon="10k"
bold="GCS Composer"
bold="GCP Composer "
href="/deployment/ingestion/external/gcs-composer" %}
Run the ingestion process externally from GCS Composer
Run the ingestion process externally from GCP Composer
{% /inlineCallout %}
{% inlineCallout
color="violet-70"

View File

@@ -1,12 +1,12 @@
---
title: Run the ingestion from GCS Composer
slug: /getting-started/day-1/hybrid-saas/gcs-composer
title: Run the ingestion from GCP Composer
slug: /getting-started/day-1/hybrid-saas/gcp-composer
collate: true
---
{% partial file="/v1.7/deployment/external-ingestion.md" /%}
# Run the ingestion from GCS Composer
# Run the ingestion from GCP Composer
## Requirements
@@ -18,7 +18,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1
## Using the Python Operator
The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
it will require you to install the packages and plugins directly on the host.
### Install the Requirements
@@ -97,7 +97,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -309,9 +309,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
{% inlineCallout
color="violet-70"
icon="10k"
bold="GCS Composer"
bold="GCP Composer "
href="/deployment/ingestion/external/gcs-composer" %}
Run the ingestion process externally from GCS Composer
Run the ingestion process externally from GCP Composer
{% /inlineCallout %}
{% inlineCallout
color="violet-70"

View File

@@ -59,8 +59,8 @@ site_menu:
url: /deployment/ingestion/external/airflow
- category: Deployment / Ingestion / External / MWAA
url: /deployment/ingestion/external/mwaa
- category: Deployment / Ingestion / External / GCS Composer
url: /deployment/ingestion/external/gcs-composer
- category: Deployment / Ingestion / External / GCP Composer
url: /deployment/ingestion/external/gcp-composer
- category: Deployment / Ingestion / External / GitHub Actions
url: /deployment/ingestion/external/github-actions
- category: Deployment / Ingestion / External / Credentials
@@ -641,8 +641,8 @@ site_menu:
url: /connectors/pipeline/airflow/configuring-lineage
- category: Connectors / Pipeline / Airflow / MWAA
url: /connectors/pipeline/airflow/mwaa
- category: Connectors / Pipeline / Airflow / GCS Composer
url: /connectors/pipeline/airflow/gcs-composer
- category: Connectors / Pipeline / Airflow / GCP Composer
url: /connectors/pipeline/airflow/gcp-composer
- category: Connectors / Pipeline / Dagster
url: /connectors/pipeline/dagster
- category: Connectors / Pipeline / Dagster / Run Externally

View File

@@ -14,8 +14,8 @@ site_menu:
url: /getting-started/day-1/hybrid-saas/airflow
- category: Getting Started / Day 1 / Hybrid SaaS / MWAA
url: /getting-started/day-1/hybrid-saas/mwaa
- category: Getting Started / Day 1 / Hybrid SaaS / GCS Composer
url: /getting-started/day-1/hybrid-saas/gcs-composer
- category: Getting Started / Day 1 / Hybrid SaaS / GCP Composer
url: /getting-started/day-1/hybrid-saas/gcp-composer
- category: Getting Started / Day 1 / Hybrid SaaS / GitHub Actions
url: /getting-started/day-1/hybrid-saas/github-actions
- category: Getting Started / Day 1 / Hybrid SaaS / Credentials
@@ -461,8 +461,8 @@ site_menu:
url: /connectors/pipeline/airflow/configuring-lineage
- category: Connectors / Pipeline / Airflow / MWAA
url: /connectors/pipeline/airflow/mwaa
- category: Connectors / Pipeline / Airflow / GCS Composer
url: /connectors/pipeline/airflow/gcs-composer
- category: Connectors / Pipeline / Airflow / GCP Composer
url: /connectors/pipeline/airflow/gcp-composer
- category: Connectors / Pipeline / Azure Data Factory
url: /connectors/pipeline/datafactory
- category: Connectors / Pipeline / Azure Data Factory / Run Externally

View File

@@ -1,9 +1,9 @@
---
title: Extract Metadata from GCS Composer
slug: /connectors/pipeline/airflow/gcs-composer
title: Extract Metadata from GCP Composer
slug: /connectors/pipeline/airflow/gcp-composer
---
# Extract Metadata from GCS Composer
# Extract Metadata from GCP Composer
## Requirements
@@ -20,7 +20,7 @@ Feel free to choose whatever approach adapts best to your current architecture a
## Using the Python Operator
The most comfortable way to extract metadata out of GCS Composer is by directly creating a DAG in there
The most comfortable way to extract metadata out of GCP Composer is by directly creating a DAG in there
that will handle the connection to the metadata database automatically and push the contents
to your OpenMetadata server.
@@ -129,7 +129,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -27,8 +27,8 @@ Configure and schedule Airflow metadata workflow from the OpenMetadata UI:
link="/deployment/ingestion/external/mwaa"
/ %}
{% tile
title="GCS Composer"
description="Run the ingestion from GCS Composer."
title="GCP Composer "
description="Run the ingestion from GCP Composer ."
link="/deployment/ingestion/external/gcs-composer"
/ %}
{% /tilesContainer %}

View File

@@ -1,13 +1,13 @@
---
title: Run the ingestion from GCS Composer | Official Documentation
description: Deploy external ingestion using GCS Composer to automate metadata and quality pipelines on Google Cloud environments.
slug: /deployment/ingestion/external/gcs-composer
title: Run the ingestion from GCP Composer | Official Documentation
description: Deploy external ingestion using GCP Composer to automate metadata and quality pipelines on Google Cloud environments.
slug: /deployment/ingestion/external/gcp-composer
collate: false
---
{% partial file="/v1.8/deployment/external-ingestion.md" /%}
# Run the ingestion from GCS Composer
# Run the ingestion from GCP Composer
## Requirements
@@ -19,7 +19,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1
## Using the Python Operator
The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
it will require you to install the packages and plugins directly on the host.
### Install the Requirements
@@ -98,7 +98,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -284,9 +284,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
{% inlineCallout
color="violet-70"
icon="10k"
bold="GCS Composer"
bold="GCP Composer "
href="/deployment/ingestion/external/gcs-composer" %}
Run the ingestion process externally from GCS Composer
Run the ingestion process externally from GCP Composer
{% /inlineCallout %}
{% inlineCallout
color="violet-70"

View File

@@ -1,12 +1,12 @@
---
title: Run the ingestion from GCS Composer
slug: /getting-started/day-1/hybrid-saas/gcs-composer
title: Run the ingestion from GCP Composer
slug: /getting-started/day-1/hybrid-saas/gcp-composer
collate: true
---
{% partial file="/v1.8/deployment/external-ingestion.md" /%}
# Run the ingestion from GCS Composer
# Run the ingestion from GCP Composer
## Requirements
@@ -18,7 +18,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1
## Using the Python Operator
The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
it will require you to install the packages and plugins directly on the host.
### Install the Requirements
@@ -97,7 +97,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -309,9 +309,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
{% inlineCallout
color="violet-70"
icon="10k"
bold="GCS Composer"
bold="GCP Composer "
href="/deployment/ingestion/external/gcs-composer" %}
Run the ingestion process externally from GCS Composer
Run the ingestion process externally from GCP Composer
{% /inlineCallout %}
{% inlineCallout
color="violet-70"

View File

@@ -65,8 +65,8 @@ site_menu:
url: /deployment/ingestion/external/airflow
- category: Deployment / Ingestion / External / MWAA
url: /deployment/ingestion/external/mwaa
- category: Deployment / Ingestion / External / GCS Composer
url: /deployment/ingestion/external/gcs-composer
- category: Deployment / Ingestion / External / GCP Composer
url: /deployment/ingestion/external/gcp-composer
- category: Deployment / Ingestion / External / GitHub Actions
url: /deployment/ingestion/external/github-actions
- category: Deployment / Ingestion / External / Credentials
@@ -647,8 +647,8 @@ site_menu:
url: /connectors/pipeline/airflow/configuring-lineage
- category: Connectors / Pipeline / Airflow / MWAA
url: /connectors/pipeline/airflow/mwaa
- category: Connectors / Pipeline / Airflow / GCS Composer
url: /connectors/pipeline/airflow/gcs-composer
- category: Connectors / Pipeline / Airflow / GCP Composer
url: /connectors/pipeline/airflow/gcp-composer
- category: Connectors / Pipeline / Dagster
url: /connectors/pipeline/dagster
- category: Connectors / Pipeline / Dagster / Run Externally

View File

@@ -14,8 +14,8 @@ site_menu:
url: /getting-started/day-1/hybrid-saas/airflow
- category: Getting Started / Day 1 / Hybrid SaaS / MWAA
url: /getting-started/day-1/hybrid-saas/mwaa
- category: Getting Started / Day 1 / Hybrid SaaS / GCS Composer
url: /getting-started/day-1/hybrid-saas/gcs-composer
- category: Getting Started / Day 1 / Hybrid SaaS / GCP Composer
url: /getting-started/day-1/hybrid-saas/gcp-composer
- category: Getting Started / Day 1 / Hybrid SaaS / GitHub Actions
url: /getting-started/day-1/hybrid-saas/github-actions
- category: Getting Started / Day 1 / Hybrid SaaS / Credentials
@@ -461,8 +461,8 @@ site_menu:
url: /connectors/pipeline/airflow/configuring-lineage
- category: Connectors / Pipeline / Airflow / MWAA
url: /connectors/pipeline/airflow/mwaa
- category: Connectors / Pipeline / Airflow / GCS Composer
url: /connectors/pipeline/airflow/gcs-composer
- category: Connectors / Pipeline / Airflow / GCP Composer
url: /connectors/pipeline/airflow/gcp-composer
- category: Connectors / Pipeline / Azure Data Factory
url: /connectors/pipeline/datafactory
- category: Connectors / Pipeline / Azure Data Factory / Run Externally

View File

@@ -1,9 +1,9 @@
---
title: Extract Metadata from GCS Composer
slug: /connectors/pipeline/airflow/gcs-composer
title: Extract Metadata from GCP Composer
slug: /connectors/pipeline/airflow/gcp-composer
---
# Extract Metadata from GCS Composer
# Extract Metadata from GCP Composer
## Requirements
@@ -20,7 +20,7 @@ Feel free to choose whatever approach adapts best to your current architecture a
## Using the Python Operator
The most comfortable way to extract metadata out of GCS Composer is by directly creating a DAG in there
The most comfortable way to extract metadata out of GCP Composer is by directly creating a DAG in there
that will handle the connection to the metadata database automatically and push the contents
to your OpenMetadata server.
@@ -129,7 +129,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -27,8 +27,8 @@ Configure and schedule Airflow metadata workflow from the OpenMetadata UI:
link="/deployment/ingestion/external/mwaa"
/ %}
{% tile
title="GCS Composer"
description="Run the ingestion from GCS Composer."
title="GCP Composer "
description="Run the ingestion from GCP Composer ."
link="/deployment/ingestion/external/gcs-composer"
/ %}
{% /tilesContainer %}

View File

@@ -1,13 +1,13 @@
---
title: Run the ingestion from GCS Composer | Official Documentation
description: Deploy external ingestion using GCS Composer to automate metadata and quality pipelines on Google Cloud environments.
slug: /deployment/ingestion/external/gcs-composer
title: Run the ingestion from GCP Composer | Official Documentation
description: Deploy external ingestion using GCP Composer to automate metadata and quality pipelines on Google Cloud environments.
slug: /deployment/ingestion/external/gcp-composer
collate: false
---
{% partial file="/v1.9/deployment/external-ingestion.md" /%}
# Run the ingestion from GCS Composer
# Run the ingestion from GCP Composer
## Requirements
@@ -19,7 +19,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1
## Using the Python Operator
The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
it will require you to install the packages and plugins directly on the host.
### Install the Requirements
@@ -98,7 +98,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -284,9 +284,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
{% inlineCallout
color="violet-70"
icon="10k"
bold="GCS Composer"
bold="GCP Composer "
href="/deployment/ingestion/external/gcs-composer" %}
Run the ingestion process externally from GCS Composer
Run the ingestion process externally from GCP Composer
{% /inlineCallout %}
{% inlineCallout
color="violet-70"

View File

@@ -1,12 +1,12 @@
---
title: Run the ingestion from GCS Composer
slug: /getting-started/day-1/hybrid-saas/gcs-composer
title: Run the ingestion from GCP Composer
slug: /getting-started/day-1/hybrid-saas/gcp-composer
collate: true
---
{% partial file="/v1.9/deployment/external-ingestion.md" /%}
# Run the ingestion from GCS Composer
# Run the ingestion from GCP Composer
## Requirements
@@ -18,7 +18,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1
## Using the Python Operator
The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
it will require you to install the packages and plugins directly on the host.
### Install the Requirements
@@ -97,7 +97,7 @@ with DAG(
## Using the Kubernetes Pod Operator
In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.
Then, the code won't directly run using the hosts' environment, but rather inside a container that we created

View File

@@ -309,9 +309,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
{% inlineCallout
color="violet-70"
icon="10k"
bold="GCS Composer"
bold="GCP Composer "
href="/deployment/ingestion/external/gcs-composer" %}
Run the ingestion process externally from GCS Composer
Run the ingestion process externally from GCP Composer
{% /inlineCallout %}
{% inlineCallout
color="violet-70"

View File

@@ -65,8 +65,8 @@ site_menu:
url: /deployment/ingestion/external/airflow
- category: Deployment / Ingestion / External / MWAA
url: /deployment/ingestion/external/mwaa
- category: Deployment / Ingestion / External / GCS Composer
url: /deployment/ingestion/external/gcs-composer
- category: Deployment / Ingestion / External / GCP Composer
url: /deployment/ingestion/external/gcp-composer
- category: Deployment / Ingestion / External / GitHub Actions
url: /deployment/ingestion/external/github-actions
- category: Deployment / Ingestion / External / Credentials
@@ -647,8 +647,8 @@ site_menu:
url: /connectors/pipeline/airflow/configuring-lineage
- category: Connectors / Pipeline / Airflow / MWAA
url: /connectors/pipeline/airflow/mwaa
- category: Connectors / Pipeline / Airflow / GCS Composer
url: /connectors/pipeline/airflow/gcs-composer
- category: Connectors / Pipeline / Airflow / GCP Composer
url: /connectors/pipeline/airflow/gcp-composer
- category: Connectors / Pipeline / Dagster
url: /connectors/pipeline/dagster
- category: Connectors / Pipeline / Dagster / Run Externally