GitBook: [#152] Usage - Edits
This commit is contained in: parent ed3def7de2, commit 3f5aa6391b
@@ -10,25 +10,25 @@ OpenMetadata enables metadata management end-to-end, giving you the ability to u
OpenMetadata provides connectors that enable you to perform metadata ingestion from a number of common database, dashboard, messaging, and pipeline services. With each release we add additional connectors, and the ingestion framework provides a structured and straightforward method for creating your own. See the table below for a list of supported connectors.
| A-H | I-M | N-R | S-Z |
| --- | --- | --- | --- |
| [Airflow](integrations/airflow/airflow.md) | [IBM Db2](integrations/connectors/ibm-db2.md) | [Oracle](integrations/connectors/oracle.md) | [Salesforce](integrations/connectors/salesforce.md) |
| Amundsen | [Kafka](integrations/connectors/kafka.md) | [Postgres](integrations/connectors/postgres/) | [SingleStore](integrations/connectors/singlestore.md) |
| Apache Atlas | LDAP | Power BI | [Snowflake](integrations/connectors/snowflake/) |
| Apache Druid | [Looker](integrations/connectors/looker.md) | Prefect | [Snowflake Usage](integrations/connectors/snowflake/snowflake-usage.md) |
| [Athena](integrations/connectors/athena.md) | [MariaDB](integrations/connectors/mariadb.md) | [Presto](integrations/connectors/presto.md) | [Superset](integrations/connectors/superset.md) |
| [Azure SQL](integrations/connectors/azure-sql.md) | [Metabase](integrations/connectors/metabase.md) | [Redash](integrations/connectors/redash.md) | [Tableau](integrations/connectors/tableau.md) |
| [BigQuery](integrations/connectors/bigquery/) | [MLflow](integrations/connectors/mlflow/) | [Redshift](integrations/connectors/redshift/) | [Trino](integrations/connectors/trino.md) |
| [BigQuery Usage](integrations/connectors/bigquery/bigquery-usage.md) | [MSSQL](integrations/connectors/mssql/) | [Redshift Usage](integrations/connectors/redshift/redshift-usage.md) | [Vertica](integrations/connectors/vertica.md) |
- | [ClickHouse](integrations/connectors/clickhouse/) | [MSSQL Usage](integrations/connectors/mssql/redshift-usage.md) | | |
+ | [ClickHouse](integrations/connectors/clickhouse/) | [MSSQL Usage](integrations/connectors/mssql/mssql-usage.md) | | |
- | [ClickHouse Usage](integrations/connectors/clickhouse/redshift-usage.md) | [MySQL](integrations/connectors/mysql/mysql.md) | | |
+ | [ClickHouse Usage](integrations/connectors/clickhouse/clickhouse-usage.md) | [MySQL](integrations/connectors/mysql/mysql.md) | | |
| [Databricks](integrations/connectors/databricks.md) | | | |
| [DBT](integrations/connectors/dbt.md) | | | |
| [Delta Lake](integrations/connectors/delta-lake.md) | | | |
| [DynamoDB](integrations/connectors/dynamodb.md) | | | |
| [Elasticsearch](integrations/connectors/elastic-search.md) | | | |
| [Glue Catalog](integrations/connectors/glue-catalog/) | | | |
| [Hive](integrations/connectors/hive.md) | | | |
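All of these connector guides drive the same ingestion framework from a JSON workflow configuration. The sketch below shows the rough shape of such a file for this era of the docs; the service name, host, and credentials are illustrative placeholders, not values from this commit:

```json
{
  "source": {
    "type": "mysql",
    "config": {
      "service_name": "local_mysql",
      "host_port": "localhost:3306",
      "username": "openmetadata_user",
      "password": "openmetadata_password"
    }
  },
  "sink": {
    "type": "metadata-rest",
    "config": {}
  },
  "metadata_server": {
    "type": "metadata-server",
    "config": {
      "api_endpoint": "http://localhost:8585/api",
      "auth_provider_type": "no-auth"
    }
  }
}
```

The `sink` and `metadata_server` blocks correspond to the "Confirm sink settings" and "Confirm metadata\_server settings" steps that each connector guide walks through.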
## OpenMetadata Components
@@ -23,7 +23,7 @@
* [BigQuery Usage](integrations/connectors/bigquery/bigquery-usage.md)
* [ClickHouse](integrations/connectors/clickhouse/README.md)
* [ClickHouse Metadata Extraction](integrations/connectors/clickhouse/mysql.md)
- * [ClickHouse Usage](integrations/connectors/clickhouse/redshift-usage.md)
+ * [ClickHouse Usage](integrations/connectors/clickhouse/clickhouse-usage.md)
* [Databricks](integrations/connectors/databricks.md)
* [DBT](integrations/connectors/dbt.md)
* [Delta Lake](integrations/connectors/delta-lake.md)
@@ -41,7 +41,7 @@
* [MLflow Metadata Extraction](integrations/connectors/mlflow/mlflow-metadata-extraction.md)
* [MSSQL](integrations/connectors/mssql/README.md)
* [MSSQL Metadata Extraction](integrations/connectors/mssql/mssql-metadata-extraction.md)
- * [MSSQL Usage](integrations/connectors/mssql/redshift-usage.md)
+ * [MSSQL Usage](integrations/connectors/mssql/mssql-usage.md)
* [MySQL](integrations/connectors/mysql/README.md)
* [MySQL Metadata Extraction](integrations/connectors/mysql/mysql.md)
* [Oracle](integrations/connectors/oracle.md)
@@ -10,7 +10,7 @@ OpenMetadata supports connectors to some popular services. We will continue as a
* [BigQuery](bigquery/)
* [BigQuery Usage](bigquery/bigquery-usage.md)
* [ClickHouse](clickhouse/)
- * [ClickHouse Usage](clickhouse/redshift-usage.md)
+ * [ClickHouse Usage](clickhouse/clickhouse-usage.md)
* [Databricks](databricks.md)
* [Delta Lake](delta-lake.md)
* [DynamoDB](dynamodb.md)
@@ -20,7 +20,7 @@ OpenMetadata supports connectors to some popular services. We will continue as a
* [IBM Db2](ibm-db2.md)
* [MariaDB](mariadb.md)
* [MSSQL](mssql/)
- * [MSSQL Usage](mssql/redshift-usage.md)
+ * [MSSQL Usage](mssql/mssql-usage.md)
* [MySQL](mysql/mysql.md)
* [Oracle](oracle.md)
* [Postgres](postgres/)
@@ -10,6 +10,6 @@ description: >-
[mysql.md](mysql.md)
{% endcontent-ref %}

- {% content-ref url="redshift-usage.md" %}
- [redshift-usage.md](redshift-usage.md)
+ {% content-ref url="clickhouse-usage.md" %}
+ [clickhouse-usage.md](clickhouse-usage.md)
{% endcontent-ref %}
@@ -8,22 +8,15 @@ description: >-
## **Requirements**

- Using the OpenMetadata ClickHouse Usage connector requires supporting services and software. Please ensure that your host system meets the requirements listed below. Then continue to follow the procedure for installing and configuring this connector.
+ Please ensure that your host system meets the requirements listed below.

- ### **OpenMetadata (version 0.8.0 or later)**
+ ### **OpenMetadata (version 0.9.0 or later)**

You must have a running deployment of OpenMetadata to use this guide. OpenMetadata includes the following services:

* OpenMetadata server supporting the metadata APIs and user interface
* Elasticsearch for metadata search and discovery
* MySQL as the backing store for all metadata
* Airflow for metadata ingestion workflows

- If you have not already deployed OpenMetadata, please follow the instructions to [Run OpenMetadata](https://docs.open-metadata.org/install/run-openmetadata) to get up and running.
+ To deploy OpenMetadata, follow the procedure [Try OpenMetadata in Docker](https://docs.open-metadata.org/install/run-openmetadata).

### **Python (version 3.8.0 or later)**

- Please use the following command to check the version of Python you have.
+ Use the following command to check your Python version.

```
python3 --version
```
@@ -33,18 +26,18 @@ python3 --version
Here’s an overview of the steps in this procedure. Please follow the steps relevant to your use case.
- 1. [Prepare a Python virtual environment](redshift-usage.md#1.-prepare-a-python-virtual-environment)
- 2. [Install the Python module for this connector](redshift-usage.md#2.-install-the-python-module-for-this-connector)
- 3. [Create a configuration file using template JSON](redshift-usage.md#3.-create-a-configuration-file-using-template-json)
- 4. [Configure service settings](redshift-usage.md#4.-configure-service-settings)
- 5. [Enable/disable the data profiler](redshift-usage.md#5.-enable-disable-the-data-profiler)
- 6. [Install the data profiler Python module (optional)](redshift-usage.md#6.-install-the-data-profiler-python-module-optional)
- 7. [Configure data filters (optional)](redshift-usage.md#7.-configure-data-filters-optional)
- 8. [Configure sample data (optional)](redshift-usage.md#8.-configure-sample-data-optional)
- 9. [Configure DBT (optional)](redshift-usage.md#9.-configure-dbt-optional)
- 10. [Confirm sink settings](redshift-usage.md#10.-confirm-sink-settings)
- 11. [Confirm metadata\_server settings](redshift-usage.md#11.-confirm-metadata\_server-settings)
- 12. [Run ingestion workflow](redshift-usage.md#12.-run-ingestion-workflow)
+ 1. [Prepare a Python virtual environment](clickhouse-usage.md#1.-prepare-a-python-virtual-environment)
+ 2. [Install the Python module for this connector](clickhouse-usage.md#2.-install-the-python-module-for-this-connector)
+ 3. [Create a configuration file using template JSON](clickhouse-usage.md#3.-create-a-configuration-file-using-template-json)
+ 4. [Configure service settings](clickhouse-usage.md#4.-configure-service-settings)
+ 5. [Enable/disable the data profiler](clickhouse-usage.md#5.-enable-disable-the-data-profiler)
+ 6. [Install the data profiler Python module (optional)](clickhouse-usage.md#6.-install-the-data-profiler-python-module-optional)
+ 7. [Configure data filters (optional)](clickhouse-usage.md#7.-configure-data-filters-optional)
+ 8. [Configure sample data (optional)](clickhouse-usage.md#8.-configure-sample-data-optional)
+ 9. [Configure DBT (optional)](clickhouse-usage.md#9.-configure-dbt-optional)
+ 10. [Confirm sink settings](clickhouse-usage.md#10.-confirm-sink-settings)
+ 11. [Confirm metadata\_server settings](clickhouse-usage.md#11.-confirm-metadata\_server-settings)
+ 12. [Run ingestion workflow](clickhouse-usage.md#12.-run-ingestion-workflow)
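As a concrete illustration of steps 1 and 2 above, the commands typically look like the following sketch; the `clickhouse-usage` extra is an assumption based on the `openmetadata-ingestion[<connector>]` naming convention these guides use:

```bash
# 1. Prepare a Python virtual environment
python3 -m venv env
source env/bin/activate

# 2. Install the Python module for this connector
# (the "clickhouse-usage" extra name is assumed from the docs' naming convention)
pip3 install 'openmetadata-ingestion[clickhouse-usage]'
```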
### **1. Prepare a Python virtual environment**
@@ -316,7 +309,7 @@ You may use either `excludes` or `includes` but not both in `table_filter_patter
Use the `source.config.schema_filter_pattern.excludes` and `source.config.schema_filter_pattern.includes` fields to select the schemas for metadata ingestion by name. The configuration template provides an example.
- The syntax and semantics for `schema_filter_pattern` are the same as for [`table_filter_pattern`](redshift-usage.md#table\_filter\_pattern-optional). Please check that section for details.
+ The syntax and semantics for `schema_filter_pattern` are the same as for [`table_filter_pattern`](clickhouse-usage.md#table\_filter\_pattern-optional). Please check that section for details.
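For instance, an excerpt of the `source` section of the configuration file might look like the following sketch; the schema names are placeholders:

```json
"source": {
  "config": {
    "schema_filter_pattern": {
      "excludes": ["system.*", "information_schema.*"]
    }
  }
}
```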
### **8. Configure sample data (optional)**
@@ -426,7 +419,7 @@ ERROR: Could not build wheels for cryptography which use PEP 517 and cannot be i
pip3 install --upgrade pip setuptools
```
- Then re-run the install command in [Step 2](redshift-usage.md#2.-install-the-python-module-for-this-connector).
+ Then re-run the install command in [Step 2](clickhouse-usage.md#2.-install-the-python-module-for-this-connector).
### **requests.exceptions.ConnectionError**
@@ -441,4 +434,4 @@ Failed to establish a new connection: [Errno 61] Connection refused'))
To correct this problem, please follow the steps in the [Run OpenMetadata](https://docs.open-metadata.org/install/run-openmetadata) guide to deploy OpenMetadata in Docker on your local machine.
- Then re-run the metadata ingestion workflow in [Step 12](redshift-usage.md#12.-run-ingestion-workflow).
+ Then re-run the metadata ingestion workflow in [Step 12](clickhouse-usage.md#12.-run-ingestion-workflow).
@@ -8,6 +8,6 @@ description: In this section, we provide guides and reference to use the MSSQL c
[mssql-metadata-extraction.md](mssql-metadata-extraction.md)
{% endcontent-ref %}

- {% content-ref url="redshift-usage.md" %}
- [redshift-usage.md](redshift-usage.md)
+ {% content-ref url="mssql-usage.md" %}
+ [mssql-usage.md](mssql-usage.md)
{% endcontent-ref %}
@@ -26,18 +26,18 @@ python3 --version
Here’s an overview of the steps in this procedure. Please follow the steps relevant to your use case.
- 1. [Prepare a Python virtual environment](redshift-usage.md#1.-prepare-a-python-virtual-environment)
- 2. [Install the Python module for this connector](redshift-usage.md#2.-install-the-python-module-for-this-connector)
- 3. [Create a configuration file using template JSON](redshift-usage.md#3.-create-a-configuration-file-using-template-json)
- 4. [Configure service settings](redshift-usage.md#4.-configure-service-settings)
- 5. [Enable/disable the data profiler](redshift-usage.md#5.-enable-disable-the-data-profiler)
- 6. [Install the data profiler Python module (optional)](redshift-usage.md#6.-install-the-data-profiler-python-module-optional)
- 7. [Configure data filters (optional)](redshift-usage.md#7.-configure-data-filters-optional)
- 8. [Configure sample data (optional)](redshift-usage.md#8.-configure-sample-data-optional)
- 9. [Configure DBT (optional)](redshift-usage.md#9.-configure-dbt-optional)
- 10. [Confirm sink settings](redshift-usage.md#10.-confirm-sink-settings)
- 11. [Confirm metadata\_server settings](redshift-usage.md#11.-confirm-metadata\_server-settings)
- 12. [Run ingestion workflow](redshift-usage.md#12.-run-ingestion-workflow)
+ 1. [Prepare a Python virtual environment](mssql-usage.md#1.-prepare-a-python-virtual-environment)
+ 2. [Install the Python module for this connector](mssql-usage.md#2.-install-the-python-module-for-this-connector)
+ 3. [Create a configuration file using template JSON](mssql-usage.md#3.-create-a-configuration-file-using-template-json)
+ 4. [Configure service settings](mssql-usage.md#4.-configure-service-settings)
+ 5. [Enable/disable the data profiler](mssql-usage.md#5.-enable-disable-the-data-profiler)
+ 6. [Install the data profiler Python module (optional)](mssql-usage.md#6.-install-the-data-profiler-python-module-optional)
+ 7. [Configure data filters (optional)](mssql-usage.md#7.-configure-data-filters-optional)
+ 8. [Configure sample data (optional)](mssql-usage.md#8.-configure-sample-data-optional)
+ 9. [Configure DBT (optional)](mssql-usage.md#9.-configure-dbt-optional)
+ 10. [Confirm sink settings](mssql-usage.md#10.-confirm-sink-settings)
+ 11. [Confirm metadata\_server settings](mssql-usage.md#11.-confirm-metadata\_server-settings)
+ 12. [Run ingestion workflow](mssql-usage.md#12.-run-ingestion-workflow)
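With the configuration file in place, step 12 typically reduces to a single CLI invocation, sketched below; the file name `mssql-usage.json` is illustrative:

```bash
# Run the metadata ingestion workflow against the configuration file
metadata ingest -c ./mssql-usage.json
```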
### **1. Prepare a Python virtual environment**
@@ -309,7 +309,7 @@ You may use either `excludes` or `includes` but not both in `table_filter_patter
Use the `source.config.schema_filter_pattern.excludes` and `source.config.schema_filter_pattern.includes` fields to select the schemas for metadata ingestion by name. The configuration template provides an example.
- The syntax and semantics for `schema_filter_pattern` are the same as for [`table_filter_pattern`](redshift-usage.md#table\_filter\_pattern-optional). Please check that section for details.
+ The syntax and semantics for `schema_filter_pattern` are the same as for [`table_filter_pattern`](mssql-usage.md#table\_filter\_pattern-optional). Please check that section for details.
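As with `table_filter_pattern`, an `includes` form is also available; the following excerpt is an illustrative sketch with placeholder schema names (remember that `excludes` and `includes` may not be combined in the same pattern):

```json
"schema_filter_pattern": {
  "includes": ["sales_.*", "finance_.*"]
}
```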
### **8. Configure sample data (optional)**
@@ -419,7 +419,7 @@ ERROR: Could not build wheels for cryptography which use PEP 517 and cannot be i
pip3 install --upgrade pip setuptools
```
- Then re-run the install command in [Step 2](redshift-usage.md#2.-install-the-python-module-for-this-connector).
+ Then re-run the install command in [Step 2](mssql-usage.md#2.-install-the-python-module-for-this-connector).
### **requests.exceptions.ConnectionError**
@@ -434,4 +434,4 @@ Failed to establish a new connection: [Errno 61] Connection refused'))
To correct this problem, please follow the steps in the [Run OpenMetadata](https://docs.open-metadata.org/install/run-openmetadata) guide to deploy OpenMetadata in Docker on your local machine.
- Then re-run the metadata ingestion workflow in [Step 12](redshift-usage.md#12.-run-ingestion-workflow).
+ Then re-run the metadata ingestion workflow in [Step 12](mssql-usage.md#12.-run-ingestion-workflow).
@@ -8,22 +8,15 @@ description: >-
## **Requirements**

- Using the OpenMetadata Redshift Usage connector requires supporting services and software. Please ensure that your host system meets the requirements listed below. Then continue to follow the procedure for installing and configuring this connector.
+ Please ensure that your host system meets the requirements listed below.

- ### **OpenMetadata (version 0.8.0 or later)**
+ ### **OpenMetadata (version 0.9.0 or later)**

You must have a running deployment of OpenMetadata to use this guide. OpenMetadata includes the following services:

* OpenMetadata server supporting the metadata APIs and user interface
* Elasticsearch for metadata search and discovery
* MySQL as the backing store for all metadata
* Airflow for metadata ingestion workflows

- If you have not already deployed OpenMetadata, please follow the instructions to [Run OpenMetadata](https://docs.open-metadata.org/install/run-openmetadata) to get up and running.
+ To deploy OpenMetadata, follow the procedure [Try OpenMetadata in Docker](https://docs.open-metadata.org/install/run-openmetadata).

### **Python (version 3.8.0 or later)**

- Please use the following command to check the version of Python you have.
+ Use the following command to check your Python version.

```
python3 --version
```
@@ -8,22 +8,15 @@ description: >-
## **Requirements**

- Using the OpenMetadata Snowflake Usage connector requires supporting services and software. Please ensure that your host system meets the requirements listed below. Then continue to follow the procedure for installing and configuring this connector.
+ Please ensure that your host system meets the requirements listed below.

- ### **OpenMetadata (version 0.8.0 or later)**
+ ### **OpenMetadata (version 0.9.0 or later)**

You must have a running deployment of OpenMetadata to use this guide. OpenMetadata includes the following services:

* OpenMetadata server supporting the metadata APIs and user interface
* Elasticsearch for metadata search and discovery
* MySQL as the backing store for all metadata
* Airflow for metadata ingestion workflows

- If you have not already deployed OpenMetadata, please follow the instructions to [Run OpenMetadata](https://docs.open-metadata.org/install/run-openmetadata) to get up and running.
+ To deploy OpenMetadata, follow the procedure [Try OpenMetadata in Docker](https://docs.open-metadata.org/install/run-openmetadata).

### **Python (version 3.8.0 or later)**

- Please use the following command to check the version of Python you have.
+ Use the following command to check your Python version.

```
python3 --version
```