Fix: Update requirements title in MD files (#11172)

Nahuel 2023-04-21 09:07:15 +02:00 committed by GitHub
parent 00fe67bb83
commit d11a8b6a71
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
51 changed files with 51 additions and 51 deletions

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the CustomDashboard connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the CustomDashboard connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/customdashboard).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the DomoDashboard connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the DomoDashboard connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/domodashboard).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Looker connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Looker connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/looker).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Metabase connector.
-# Requirements
+## Requirements
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Mode connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Mode connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/mode).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the QuickSight connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the QuickSight connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/quicksight).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Redash connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Redash connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/redash).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Superset connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Superset connector in the [docs](https://docs.open-metadata.org/connectors/dashboard/superset).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Tableau connector.
-# Requirements
+## Requirements
 To ingest Tableau metadata, the username used in the configuration **must** have at least the following role: `Site Role: Viewer`.
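The Viewer-role requirement above lands in the service connection of the ingestion config. A minimal sketch, assuming username/password auth (host, site, and credential values are placeholders; exact field names follow the OpenMetadata Tableau connection schema and may vary by version):

```yaml
source:
  type: tableau
  serviceName: local_tableau          # placeholder service name
  serviceConnection:
    config:
      type: Tableau
      hostPort: https://tableau.example.com   # hypothetical Tableau Server URL
      username: viewer_user                   # must hold at least Site Role: Viewer
      password: "<password>"
      siteName: my_site                       # empty string targets the default site
```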

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Athena connector.
-# Requirements
+## Requirements
 The Athena connector ingests metadata through JDBC connections.
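Because Athena queries over JDBC write their results to S3, the connection needs a staging location alongside the region. A minimal sketch of the service connection, assuming default-credential-chain auth (bucket, region, and workgroup values are placeholders; field names follow the OpenMetadata Athena connection schema and may vary by version):

```yaml
source:
  type: athena
  serviceName: local_athena           # placeholder service name
  serviceConnection:
    config:
      type: Athena
      awsConfig:
        awsRegion: us-east-1                          # hypothetical region
      s3StagingDir: s3://my-athena-results/queries/   # where JDBC query results land
      workgroup: primary
```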

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the AzureSQL connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the AzureSQL connector in the [docs](https://docs.open-metadata.org/connectors/database/azuresql).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the BigQuery connector.
-# Requirements
+## Requirements
 <InlineCallout color="violet-70" icon="description" bold="OpenMetadata 0.12 or later" href="/deployment">
 To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Clickhouse connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Clickhouse connector in the [docs](https://docs.open-metadata.org/connectors/database/clickhouse).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the CustomDatabase connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the CustomDatabase connector in the [docs](https://docs.open-metadata.org/connectors/database/customdatabase).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Databricks connector. You can view the full documentation for Databricks [here](https://docs.open-metadata.org/connectors/database/databricks).
-# Requirements
+## Requirements
 You can find further information on the Databricks connector in the [docs](https://docs.open-metadata.org/connectors/database/databricks).
 To learn more about the Databricks connection details (`hostPort`, `token`, `http_path`), visit the [docs](https://docs.open-metadata.org/connectors/database/databricks/troubleshooting).
 $$note
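The three connection details named above (`hostPort`, `token`, `http_path`) slot into the service connection like this; a minimal sketch with placeholder values (field names follow the OpenMetadata Databricks connection schema and may vary by version):

```yaml
source:
  type: databricks
  serviceName: local_databricks       # placeholder service name
  serviceConnection:
    config:
      type: Databricks
      hostPort: adb-0000000000.azuredatabricks.net:443  # hypothetical workspace host
      token: "<personal-access-token>"
      httpPath: /sql/1.0/warehouses/abc123              # hypothetical SQL warehouse path
```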

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Datalake connector.
-# Requirements
+## Requirements
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
 custom Airflow plugins to handle the workflow deployment.

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Db2 connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Db2 connector in the [docs](https://docs.open-metadata.org/connectors/database/db2).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the DeltaLake connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the DeltaLake connector in the [docs](https://docs.open-metadata.org/connectors/database/deltalake).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the DomoDatabase connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the DomoDatabase connector in the [docs](https://docs.open-metadata.org/connectors/database/domodatabase).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Druid connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Druid connector in the [docs](https://docs.open-metadata.org/connectors/database/druid).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the DynamoDB connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the DynamoDB connector in the [docs](https://docs.open-metadata.org/connectors/database/dynamodb).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Glue connector.
-# Requirements
+## Requirements
 The Glue connector ingests metadata through the AWS [Boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/glue.html) client.
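Since ingestion goes through Boto3, the service connection is essentially an AWS credentials block. A minimal sketch with placeholder credentials (field names follow the OpenMetadata Glue connection schema and may vary by version; the connector can also pick up the default AWS credential chain):

```yaml
source:
  type: glue
  serviceName: local_glue             # placeholder service name
  serviceConnection:
    config:
      type: Glue
      awsConfig:
        awsAccessKeyId: "<access-key-id>"
        awsSecretAccessKey: "<secret-access-key>"
        awsRegion: us-east-2          # hypothetical region
```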

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Mssql connector.
-# Requirements
+## Requirements
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
 custom Airflow plugins to handle the workflow deployment.

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Oracle connector.
-# Requirements
+## Requirements
 To ingest metadata from Oracle, a user must have the `CREATE SESSION` privilege.
 ```sql
 GRANT CREATE SESSION TO user_name;
 ```

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the PinotDB connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the PinotDB connector in the [docs](https://docs.open-metadata.org/connectors/database/pinotdb).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the SQLite connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the SQLite connector in the [docs](https://docs.open-metadata.org/connectors/database/sqlite).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Salesforce connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Salesforce connector in the [docs](https://docs.open-metadata.org/connectors/database/salesforce).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Snowflake connector.
-# Requirements
+## Requirements
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
 custom Airflow plugins to handle the workflow deployment.

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Vertica connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Vertica connector in the [docs](https://docs.open-metadata.org/connectors/database/vertica).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the CustomMessaging connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the CustomMessaging connector in the [docs](https://docs.open-metadata.org/connectors/messaging/custommessaging).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Kafka connector.
-# Requirements
+## Requirements
 Connecting to Kafka does not require any previous configuration.
 Just to remind you, the ingestion of the Kafka topics schema is done separately by configuring the **Schema Registry URL**. However, only the **Bootstrap Servers** information is mandatory.
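The two settings above map directly onto the service connection: only the bootstrap servers are required, and the registry URL is added when you also want topic schemas. A minimal sketch, assuming a local broker and registry (hosts and service name are placeholders; field names follow the OpenMetadata Kafka connection schema and may vary by version):

```yaml
source:
  type: kafka
  serviceName: local_kafka            # placeholder service name
  serviceConnection:
    config:
      type: Kafka
      bootstrapServers: localhost:9092          # mandatory
      schemaRegistryURL: http://localhost:8081  # optional; enables schema ingestion
```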

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Kinesis connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Kinesis connector in the [docs](https://docs.open-metadata.org/connectors/messaging/kinesis).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Pulsar connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Pulsar connector in the [docs](https://docs.open-metadata.org/connectors/messaging/pulsar).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Redpanda connector.
-# Requirements
+## Requirements
 Connecting to Redpanda does not require any previous configuration.
 Just to remind you, the ingestion of the Redpanda topics schema is done separately by configuring the **Schema Registry URL**. However, only the **Bootstrap Servers** information is mandatory.
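As with Kafka, the mandatory bootstrap servers and optional registry URL form the whole connection. A minimal sketch with placeholder hosts (field names follow the OpenMetadata Redpanda connection schema and may vary by version):

```yaml
source:
  type: redpanda
  serviceName: local_redpanda         # placeholder service name
  serviceConnection:
    config:
      type: Redpanda
      bootstrapServers: localhost:9092          # mandatory
      schemaRegistryURL: http://localhost:8081  # optional; enables schema ingestion
```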

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Amundsen connector.
-# Requirements
+## Requirements
 When connecting to Amundsen, make sure to pass `hostPort` with a prefix such as:
 1. `bolt` (recommended)
 2. `neo4j`
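The prefix rule above is easy to sanity-check before configuring the connector; a minimal sketch (the helper name is hypothetical, not part of OpenMetadata):

```python
def has_supported_scheme(host_port: str) -> bool:
    """Check that an Amundsen hostPort carries an explicit scheme prefix."""
    # bolt:// is the recommended prefix; neo4j:// also works.
    return host_port.startswith(("bolt://", "neo4j://"))

print(has_supported_scheme("bolt://localhost:7687"))  # True
print(has_supported_scheme("localhost:7687"))         # False
```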

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Atlas connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Atlas connector in the [docs](https://docs.open-metadata.org/connectors/metadata/atlas).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the OpenMetadata connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the OpenMetadata connector in the [docs](https://docs.open-metadata.org/connectors/metadata/openmetadata).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the CustomMlModel connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the CustomMlModel connector in the [docs](https://docs.open-metadata.org/connectors/mlmodel/custommlmodel).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the SageMaker connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the SageMaker connector in the [docs](https://docs.open-metadata.org/connectors/mlmodel/sagemaker).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Sklearn connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Sklearn connector in the [docs](https://docs.open-metadata.org/connectors/mlmodel/sklearn).

@@ -1,7 +1,7 @@
 # Airbyte
 In this section, we provide guides and references to use the Airbyte connector. You can view the full documentation for Airbyte [here](https://docs.open-metadata.org/connectors/pipeline/airbyte).
-# Requirements
+## Requirements
 You can find further information on the Airbyte connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/airbyte).
 ## Connection Details

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Airflow connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Airflow connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/airflow).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the CustomPipeline connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the CustomPipeline connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/custompipeline).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Dagster connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Dagster connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/dagster).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Databricks Pipeline connector. You can view the full documentation for DatabricksPipeline [here](https://docs.open-metadata.org/connectors/pipeline/databrickspipeline).
-# Requirements
+## Requirements
 You can find further information on the Databricks Pipeline connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/databrickspipeline).
 To learn more about the Databricks connection details (`hostPort`, `token`, `http_path`), visit the [docs](https://docs.open-metadata.org/connectors/database/databricks/troubleshooting).
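The same connection details power the pipeline variant of the connector. A minimal sketch with placeholder values (field names follow the OpenMetadata DatabricksPipeline connection schema and may vary by version):

```yaml
source:
  type: databrickspipeline
  serviceName: local_databricks_pipeline   # placeholder service name
  serviceConnection:
    config:
      type: DatabricksPipeline
      hostPort: adb-0000000000.azuredatabricks.net:443  # hypothetical workspace host
      token: "<personal-access-token>"
```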

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the DomoPipeline connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the DomoPipeline connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/domopipeline).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Fivetran connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the Fivetran connector in the [docs](https://docs.open-metadata.org/connectors/pipeline/fivetran).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the Glue connector for Pipeline Services.
-# Requirements
+## Requirements
 The Glue connector ingests metadata through the AWS [Boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/glue.html) client.
 We will ingest Workflows, their Jobs, and their run status.

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the AzureObjectStore connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the AzureObjectStore connector in the [docs](https://docs.open-metadata.org/connectors/objectstore/azure).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the GcsObjectStore connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the GcsObjectStore connector in the [docs](https://docs.open-metadata.org/connectors/objectstore/gcs).

@@ -2,7 +2,7 @@
 In this section, we provide guides and references to use the S3ObjectStore connector.
-# Requirements
+## Requirements
 <!-- to be updated -->
 You can find further information on the S3ObjectStore connector in the [docs](https://docs.open-metadata.org/connectors/objectstore/s3).