mirror of https://github.com/open-metadata/OpenMetadata.git
synced 2025-09-25 17:04:54 +00:00
Docs: Correcting GCS Composer to GCP Composer (#22429)
* Docs: Correcting GCS Composer to GCP Composer

* Docs: Correcting GCS Composer to GCP Composer

---------

Co-authored-by: Rounak <rounakpreet.d@deuexsolutions.com>
This commit is contained in:
parent 049fa84330
commit 69bead8ee9
@@ -14,8 +14,8 @@ site_menu:
     url: /getting-started/day-1/hybrid-saas/airflow
   - category: Getting Started / Day 1 / Hybrid SaaS / MWAA
     url: /getting-started/day-1/hybrid-saas/mwaa
-  - category: Getting Started / Day 1 / Hybrid SaaS / GCS Composer
-    url: /getting-started/day-1/hybrid-saas/gcs-composer
+  - category: Getting Started / Day 1 / Hybrid SaaS / GCP Composer
+    url: /getting-started/day-1/hybrid-saas/gcp-composer
   - category: Getting Started / Day 1 / Hybrid SaaS / GitHub Actions
     url: /getting-started/day-1/hybrid-saas/github-actions
   - category: Getting Started / Day 1 / Hybrid SaaS / Credentials
@@ -451,8 +451,8 @@ site_menu:
     url: /connectors/pipeline/airflow/configuring-lineage
   - category: Connectors / Pipeline / Airflow / MWAA
     url: /connectors/pipeline/airflow/mwaa
-  - category: Connectors / Pipeline / Airflow / GCS Composer
-    url: /connectors/pipeline/airflow/gcs-composer
+  - category: Connectors / Pipeline / Airflow / GCP Composer
+    url: /connectors/pipeline/airflow/gcp-composer
   - category: Connectors / Pipeline / Azure Data Factory
     url: /connectors/pipeline/datafactory
   - category: Connectors / Pipeline / Azure Data Factory / Run Externally
@@ -1,9 +1,9 @@
 ---
-title: Extract Metadata from GCS Composer
-slug: /connectors/pipeline/airflow/gcs-composer
+title: Extract Metadata from GCP Composer
+slug: /connectors/pipeline/airflow/gcp-composer
 ---

-# Extract Metadata from GCS Composer
+# Extract Metadata from GCP Composer

 ## Requirements

@@ -20,7 +20,7 @@ Feel free to choose whatever approach adapts best to your current architecture a

 ## Using the Python Operator

-The most comfortable way to extract metadata out of GCS Composer is by directly creating a DAG in there
+The most comfortable way to extract metadata out of GCP Composer is by directly creating a DAG in there
 that will handle the connection to the metadata database automatically and push the contents
 to your OpenMetadata server.
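For context on the approach this hunk describes, here is a minimal sketch of such a DAG, loosely following `openmetadata-ingestion` conventions; the service name, `hostPort`, and JWT token below are placeholders, and the `MetadataWorkflow` import path should be verified against your installed package version.

```python
"""Minimal sketch of the PythonOperator approach, assuming the
openmetadata-ingestion package is installed in the Composer environment.
All connection values below are placeholders, not working defaults."""
from datetime import datetime, timedelta

import yaml
from airflow import DAG
from airflow.operators.python import PythonOperator

# Workflow definition: read the Composer (Airflow) metadata backend and
# push pipeline entities to the OpenMetadata server.
CONFIG = """
source:
  type: airflow
  serviceName: gcp_composer  # placeholder service name
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080
      connection:
        type: Backend
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: https://<your-openmetadata-host>/api  # placeholder
    authProvider: openmetadata
    securityConfig:
      jwtToken: <jwt-token>  # placeholder
"""


def metadata_ingestion_workflow() -> None:
    # Import inside the callable so the DAG file still parses even if the
    # package is missing; check this path against your installed version.
    from metadata.workflow.metadata import MetadataWorkflow

    workflow = MetadataWorkflow.create(yaml.safe_load(CONFIG))
    workflow.execute()
    workflow.raise_from_status()  # fail the task on ingestion errors
    workflow.print_status()
    workflow.stop()


with DAG(
    "composer_metadata_ingestion",
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    description="Push Composer pipeline metadata to OpenMetadata",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```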
@@ -129,7 +129,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
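And a matching sketch of the `KubernetesPodOperator` alternative: the workflow runs inside OpenMetadata's `ingestion-base` image on Composer's GKE cluster, so the Composer environment itself needs no extra packages. The image tag, the `config`/`pipelineType` env-var contract, and the import path (newer cncf-kubernetes providers; older ones use `...operators.kubernetes_pod`) are assumptions to verify against your versions.

```python
"""Sketch of the KubernetesPodOperator approach: nothing is installed on the
Composer host; the workflow runs in a container on the underlying GKE cluster.
Image tag and env-var names follow OpenMetadata's ingestion-base conventions
and should be checked against your OpenMetadata release."""
from datetime import datetime

from airflow import DAG
# Older provider versions expose this as ...operators.kubernetes_pod
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

CONFIG = """..."""  # same workflow YAML as in the PythonOperator sketch

with DAG(
    "composer_metadata_ingestion_k8s",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    KubernetesPodOperator(
        task_id="ingest",
        name="ingest",
        image="openmetadata/ingestion-base:1.7.0",  # tag is illustrative
        cmds=["python", "main.py"],
        # The image entrypoint reads the workflow definition from env vars
        env_vars={"config": CONFIG, "pipelineType": "metadata"},
        namespace="default",
        in_cluster=True,  # reuse Composer's own cluster credentials
        get_logs=True,
        is_delete_operator_pod=True,
    )
```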
@@ -27,8 +27,8 @@ Configure and schedule Airflow metadata workflow from the OpenMetadata UI:
     link="/deployment/ingestion/external/mwaa"
   / %}
   {% tile
-    title="GCS Composer"
-    description="Run the ingestion from GCS Composer."
+    title="GCP Composer"
+    description="Run the ingestion from GCP Composer."
     link="/deployment/ingestion/external/gcs-composer"
   / %}
 {% /tilesContainer %}
@@ -1,13 +1,13 @@
 ---
-title: Run the ingestion from GCS Composer | Official Documentation
-description: Deploy external ingestion using GCS Composer to automate metadata and quality pipelines on Google Cloud environments.
-slug: /deployment/ingestion/external/gcs-composer
+title: Run the ingestion from GCP Composer | Official Documentation
+description: Deploy external ingestion using GCP Composer to automate metadata and quality pipelines on Google Cloud environments.
+slug: /deployment/ingestion/external/gcp-composer
 collate: false
 ---

 {% partial file="/v1.7/deployment/external-ingestion.md" /%}

-# Run the ingestion from GCS Composer
+# Run the ingestion from GCP Composer

 ## Requirements

@@ -19,7 +19,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1

 ## Using the Python Operator

-The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
+The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
 it will require you to install the packages and plugins directly on the host.

 ### Install the Requirements
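Since this route installs `openmetadata-ingestion` directly on the host, a small preflight check can fail fast when the environment is missing the package or running an older release. The exact minimum version is truncated in the hunk header above (`openmetadata-ingestion==1…`), so the default below is only a placeholder.

```python
"""Optional preflight task: verify the host has openmetadata-ingestion
before the real workflow runs. The minimum version is a placeholder."""
from importlib.metadata import PackageNotFoundError, version

from packaging.version import Version


def assert_ingestion_installed(minimum: str = "1.0.0") -> None:
    # Fail fast with a clear message instead of an ImportError mid-task.
    try:
        installed = Version(version("openmetadata-ingestion"))
    except PackageNotFoundError as exc:
        raise RuntimeError("openmetadata-ingestion is not installed") from exc
    if installed < Version(minimum):
        raise RuntimeError(
            f"openmetadata-ingestion {installed} is older than required {minimum}"
        )
```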
@@ -98,7 +98,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -284,9 +284,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
 {% inlineCallout
     color="violet-70"
     icon="10k"
-    bold="GCS Composer"
+    bold="GCP Composer"
     href="/deployment/ingestion/external/gcs-composer" %}
-Run the ingestion process externally from GCS Composer
+Run the ingestion process externally from GCP Composer
 {% /inlineCallout %}
 {% inlineCallout
     color="violet-70"
@@ -1,12 +1,12 @@
 ---
-title: Run the ingestion from GCS Composer
-slug: /getting-started/day-1/hybrid-saas/gcs-composer
+title: Run the ingestion from GCP Composer
+slug: /getting-started/day-1/hybrid-saas/gcp-composer
 collate: true
 ---

 {% partial file="/v1.7/deployment/external-ingestion.md" /%}

-# Run the ingestion from GCS Composer
+# Run the ingestion from GCP Composer

 ## Requirements
@@ -18,7 +18,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1

 ## Using the Python Operator

-The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
+The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
 it will require you to install the packages and plugins directly on the host.

 ### Install the Requirements

@@ -97,7 +97,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -309,9 +309,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
 {% inlineCallout
     color="violet-70"
     icon="10k"
-    bold="GCS Composer"
+    bold="GCP Composer"
     href="/deployment/ingestion/external/gcs-composer" %}
-Run the ingestion process externally from GCS Composer
+Run the ingestion process externally from GCP Composer
 {% /inlineCallout %}
 {% inlineCallout
     color="violet-70"
@@ -59,8 +59,8 @@ site_menu:
     url: /deployment/ingestion/external/airflow
   - category: Deployment / Ingestion / External / MWAA
     url: /deployment/ingestion/external/mwaa
-  - category: Deployment / Ingestion / External / GCS Composer
-    url: /deployment/ingestion/external/gcs-composer
+  - category: Deployment / Ingestion / External / GCP Composer
+    url: /deployment/ingestion/external/gcp-composer
   - category: Deployment / Ingestion / External / GitHub Actions
     url: /deployment/ingestion/external/github-actions
   - category: Deployment / Ingestion / External / Credentials

@@ -641,8 +641,8 @@ site_menu:
     url: /connectors/pipeline/airflow/configuring-lineage
   - category: Connectors / Pipeline / Airflow / MWAA
     url: /connectors/pipeline/airflow/mwaa
-  - category: Connectors / Pipeline / Airflow / GCS Composer
-    url: /connectors/pipeline/airflow/gcs-composer
+  - category: Connectors / Pipeline / Airflow / GCP Composer
+    url: /connectors/pipeline/airflow/gcp-composer
   - category: Connectors / Pipeline / Dagster
     url: /connectors/pipeline/dagster
   - category: Connectors / Pipeline / Dagster / Run Externally
@@ -14,8 +14,8 @@ site_menu:
     url: /getting-started/day-1/hybrid-saas/airflow
   - category: Getting Started / Day 1 / Hybrid SaaS / MWAA
     url: /getting-started/day-1/hybrid-saas/mwaa
-  - category: Getting Started / Day 1 / Hybrid SaaS / GCS Composer
-    url: /getting-started/day-1/hybrid-saas/gcs-composer
+  - category: Getting Started / Day 1 / Hybrid SaaS / GCP Composer
+    url: /getting-started/day-1/hybrid-saas/gcp-composer
   - category: Getting Started / Day 1 / Hybrid SaaS / GitHub Actions
     url: /getting-started/day-1/hybrid-saas/github-actions
   - category: Getting Started / Day 1 / Hybrid SaaS / Credentials

@@ -461,8 +461,8 @@ site_menu:
     url: /connectors/pipeline/airflow/configuring-lineage
   - category: Connectors / Pipeline / Airflow / MWAA
     url: /connectors/pipeline/airflow/mwaa
-  - category: Connectors / Pipeline / Airflow / GCS Composer
-    url: /connectors/pipeline/airflow/gcs-composer
+  - category: Connectors / Pipeline / Airflow / GCP Composer
+    url: /connectors/pipeline/airflow/gcp-composer
   - category: Connectors / Pipeline / Azure Data Factory
     url: /connectors/pipeline/datafactory
   - category: Connectors / Pipeline / Azure Data Factory / Run Externally
@@ -1,9 +1,9 @@
 ---
-title: Extract Metadata from GCS Composer
-slug: /connectors/pipeline/airflow/gcs-composer
+title: Extract Metadata from GCP Composer
+slug: /connectors/pipeline/airflow/gcp-composer
 ---

-# Extract Metadata from GCS Composer
+# Extract Metadata from GCP Composer

 ## Requirements

@@ -20,7 +20,7 @@ Feel free to choose whatever approach adapts best to your current architecture a

 ## Using the Python Operator

-The most comfortable way to extract metadata out of GCS Composer is by directly creating a DAG in there
+The most comfortable way to extract metadata out of GCP Composer is by directly creating a DAG in there
 that will handle the connection to the metadata database automatically and push the contents
 to your OpenMetadata server.

@@ -129,7 +129,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -27,8 +27,8 @@ Configure and schedule Airflow metadata workflow from the OpenMetadata UI:
     link="/deployment/ingestion/external/mwaa"
   / %}
   {% tile
-    title="GCS Composer"
-    description="Run the ingestion from GCS Composer."
+    title="GCP Composer"
+    description="Run the ingestion from GCP Composer."
     link="/deployment/ingestion/external/gcs-composer"
   / %}
 {% /tilesContainer %}
@@ -1,13 +1,13 @@
 ---
-title: Run the ingestion from GCS Composer | Official Documentation
-description: Deploy external ingestion using GCS Composer to automate metadata and quality pipelines on Google Cloud environments.
-slug: /deployment/ingestion/external/gcs-composer
+title: Run the ingestion from GCP Composer | Official Documentation
+description: Deploy external ingestion using GCP Composer to automate metadata and quality pipelines on Google Cloud environments.
+slug: /deployment/ingestion/external/gcp-composer
 collate: false
 ---

 {% partial file="/v1.8/deployment/external-ingestion.md" /%}

-# Run the ingestion from GCS Composer
+# Run the ingestion from GCP Composer

 ## Requirements

@@ -19,7 +19,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1

 ## Using the Python Operator

-The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
+The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
 it will require you to install the packages and plugins directly on the host.

 ### Install the Requirements

@@ -98,7 +98,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -284,9 +284,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
 {% inlineCallout
     color="violet-70"
     icon="10k"
-    bold="GCS Composer"
+    bold="GCP Composer"
     href="/deployment/ingestion/external/gcs-composer" %}
-Run the ingestion process externally from GCS Composer
+Run the ingestion process externally from GCP Composer
 {% /inlineCallout %}
 {% inlineCallout
     color="violet-70"
@@ -1,12 +1,12 @@
 ---
-title: Run the ingestion from GCS Composer
-slug: /getting-started/day-1/hybrid-saas/gcs-composer
+title: Run the ingestion from GCP Composer
+slug: /getting-started/day-1/hybrid-saas/gcp-composer
 collate: true
 ---

 {% partial file="/v1.8/deployment/external-ingestion.md" /%}

-# Run the ingestion from GCS Composer
+# Run the ingestion from GCP Composer

 ## Requirements

@@ -18,7 +18,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1

 ## Using the Python Operator

-The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
+The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
 it will require you to install the packages and plugins directly on the host.

 ### Install the Requirements

@@ -97,7 +97,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -309,9 +309,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
 {% inlineCallout
     color="violet-70"
     icon="10k"
-    bold="GCS Composer"
+    bold="GCP Composer"
     href="/deployment/ingestion/external/gcs-composer" %}
-Run the ingestion process externally from GCS Composer
+Run the ingestion process externally from GCP Composer
 {% /inlineCallout %}
 {% inlineCallout
     color="violet-70"
@@ -65,8 +65,8 @@ site_menu:
     url: /deployment/ingestion/external/airflow
   - category: Deployment / Ingestion / External / MWAA
     url: /deployment/ingestion/external/mwaa
-  - category: Deployment / Ingestion / External / GCS Composer
-    url: /deployment/ingestion/external/gcs-composer
+  - category: Deployment / Ingestion / External / GCP Composer
+    url: /deployment/ingestion/external/gcp-composer
   - category: Deployment / Ingestion / External / GitHub Actions
     url: /deployment/ingestion/external/github-actions
   - category: Deployment / Ingestion / External / Credentials

@@ -647,8 +647,8 @@ site_menu:
     url: /connectors/pipeline/airflow/configuring-lineage
   - category: Connectors / Pipeline / Airflow / MWAA
     url: /connectors/pipeline/airflow/mwaa
-  - category: Connectors / Pipeline / Airflow / GCS Composer
-    url: /connectors/pipeline/airflow/gcs-composer
+  - category: Connectors / Pipeline / Airflow / GCP Composer
+    url: /connectors/pipeline/airflow/gcp-composer
   - category: Connectors / Pipeline / Dagster
     url: /connectors/pipeline/dagster
   - category: Connectors / Pipeline / Dagster / Run Externally
@@ -14,8 +14,8 @@ site_menu:
     url: /getting-started/day-1/hybrid-saas/airflow
   - category: Getting Started / Day 1 / Hybrid SaaS / MWAA
     url: /getting-started/day-1/hybrid-saas/mwaa
-  - category: Getting Started / Day 1 / Hybrid SaaS / GCS Composer
-    url: /getting-started/day-1/hybrid-saas/gcs-composer
+  - category: Getting Started / Day 1 / Hybrid SaaS / GCP Composer
+    url: /getting-started/day-1/hybrid-saas/gcp-composer
   - category: Getting Started / Day 1 / Hybrid SaaS / GitHub Actions
     url: /getting-started/day-1/hybrid-saas/github-actions
   - category: Getting Started / Day 1 / Hybrid SaaS / Credentials

@@ -461,8 +461,8 @@ site_menu:
     url: /connectors/pipeline/airflow/configuring-lineage
   - category: Connectors / Pipeline / Airflow / MWAA
     url: /connectors/pipeline/airflow/mwaa
-  - category: Connectors / Pipeline / Airflow / GCS Composer
-    url: /connectors/pipeline/airflow/gcs-composer
+  - category: Connectors / Pipeline / Airflow / GCP Composer
+    url: /connectors/pipeline/airflow/gcp-composer
   - category: Connectors / Pipeline / Azure Data Factory
     url: /connectors/pipeline/datafactory
   - category: Connectors / Pipeline / Azure Data Factory / Run Externally
@@ -1,9 +1,9 @@
 ---
-title: Extract Metadata from GCS Composer
-slug: /connectors/pipeline/airflow/gcs-composer
+title: Extract Metadata from GCP Composer
+slug: /connectors/pipeline/airflow/gcp-composer
 ---

-# Extract Metadata from GCS Composer
+# Extract Metadata from GCP Composer

 ## Requirements

@@ -20,7 +20,7 @@ Feel free to choose whatever approach adapts best to your current architecture a

 ## Using the Python Operator

-The most comfortable way to extract metadata out of GCS Composer is by directly creating a DAG in there
+The most comfortable way to extract metadata out of GCP Composer is by directly creating a DAG in there
 that will handle the connection to the metadata database automatically and push the contents
 to your OpenMetadata server.

@@ -129,7 +129,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -27,8 +27,8 @@ Configure and schedule Airflow metadata workflow from the OpenMetadata UI:
     link="/deployment/ingestion/external/mwaa"
   / %}
   {% tile
-    title="GCS Composer"
-    description="Run the ingestion from GCS Composer."
+    title="GCP Composer"
+    description="Run the ingestion from GCP Composer."
     link="/deployment/ingestion/external/gcs-composer"
   / %}
 {% /tilesContainer %}
@@ -1,13 +1,13 @@
 ---
-title: Run the ingestion from GCS Composer | Official Documentation
-description: Deploy external ingestion using GCS Composer to automate metadata and quality pipelines on Google Cloud environments.
-slug: /deployment/ingestion/external/gcs-composer
+title: Run the ingestion from GCP Composer | Official Documentation
+description: Deploy external ingestion using GCP Composer to automate metadata and quality pipelines on Google Cloud environments.
+slug: /deployment/ingestion/external/gcp-composer
 collate: false
 ---

 {% partial file="/v1.9/deployment/external-ingestion.md" /%}

-# Run the ingestion from GCS Composer
+# Run the ingestion from GCP Composer

 ## Requirements

@@ -19,7 +19,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1

 ## Using the Python Operator

-The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
+The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
 it will require you to install the packages and plugins directly on the host.

 ### Install the Requirements

@@ -98,7 +98,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -284,9 +284,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
 {% inlineCallout
     color="violet-70"
     icon="10k"
-    bold="GCS Composer"
+    bold="GCP Composer"
     href="/deployment/ingestion/external/gcs-composer" %}
-Run the ingestion process externally from GCS Composer
+Run the ingestion process externally from GCP Composer
 {% /inlineCallout %}
 {% inlineCallout
     color="violet-70"
@@ -1,12 +1,12 @@
 ---
-title: Run the ingestion from GCS Composer
-slug: /getting-started/day-1/hybrid-saas/gcs-composer
+title: Run the ingestion from GCP Composer
+slug: /getting-started/day-1/hybrid-saas/gcp-composer
 collate: true
 ---

 {% partial file="/v1.9/deployment/external-ingestion.md" /%}

-# Run the ingestion from GCS Composer
+# Run the ingestion from GCP Composer

 ## Requirements

@@ -18,7 +18,7 @@ It also requires the ingestion package to be at least `openmetadata-ingestion==1

 ## Using the Python Operator

-The most comfortable way to run the metadata workflows from GCS Composer is directly via a `PythonOperator`. Note that
+The most comfortable way to run the metadata workflows from GCP Composer is directly via a `PythonOperator`. Note that
 it will require you to install the packages and plugins directly on the host.

 ### Install the Requirements

@@ -97,7 +97,7 @@ with DAG(

 ## Using the Kubernetes Pod Operator

-In this second approach we won't need to install absolutely anything to the GCS Composer environment. Instead,
+In this second approach we won't need to install absolutely anything to the GCP Composer environment. Instead,
 we will rely on the `KubernetesPodOperator` to use the underlying k8s cluster of Composer.

 Then, the code won't directly run using the hosts' environment, but rather inside a container that we created
@@ -309,9 +309,9 @@ don't hesitate to reach to us in [Slack](https://slack.open-metadata.org/) or di
 {% inlineCallout
     color="violet-70"
     icon="10k"
-    bold="GCS Composer"
+    bold="GCP Composer"
     href="/deployment/ingestion/external/gcs-composer" %}
-Run the ingestion process externally from GCS Composer
+Run the ingestion process externally from GCP Composer
 {% /inlineCallout %}
 {% inlineCallout
     color="violet-70"
@@ -65,8 +65,8 @@ site_menu:
     url: /deployment/ingestion/external/airflow
   - category: Deployment / Ingestion / External / MWAA
     url: /deployment/ingestion/external/mwaa
-  - category: Deployment / Ingestion / External / GCS Composer
-    url: /deployment/ingestion/external/gcs-composer
+  - category: Deployment / Ingestion / External / GCP Composer
+    url: /deployment/ingestion/external/gcp-composer
   - category: Deployment / Ingestion / External / GitHub Actions
     url: /deployment/ingestion/external/github-actions
   - category: Deployment / Ingestion / External / Credentials

@@ -647,8 +647,8 @@ site_menu:
     url: /connectors/pipeline/airflow/configuring-lineage
   - category: Connectors / Pipeline / Airflow / MWAA
     url: /connectors/pipeline/airflow/mwaa
-  - category: Connectors / Pipeline / Airflow / GCS Composer
-    url: /connectors/pipeline/airflow/gcs-composer
+  - category: Connectors / Pipeline / Airflow / GCP Composer
+    url: /connectors/pipeline/airflow/gcp-composer
   - category: Connectors / Pipeline / Dagster
     url: /connectors/pipeline/dagster
   - category: Connectors / Pipeline / Dagster / Run Externally