Adding articles to How-to Guides (#13068)

This commit is contained in:
Shilpa Vernekar 2023-09-04 11:44:54 +05:30 committed by GitHub
parent 029786d773
commit f8a534ab4a
26 changed files with 279 additions and 36 deletions

View File

@@ -1,13 +1,13 @@
 ---
-title: Mlflow
+title: MLflow
 slug: /connectors/ml-model/mlflow
 ---
-# Mlflow
+# MLflow
-In this section, we provide guides and references to use the Mlflow connector.
+In this section, we provide guides and references to use the MLflow connector.
-Configure and schedule Mlflow metadata and profiler workflows from the OpenMetadata UI:
+Configure and schedule MLflow metadata and profiler workflows from the OpenMetadata UI:
 - [Requirements](#requirements)
 - [Metadata Ingestion](#metadata-ingestion)
@@ -37,8 +37,8 @@ To extract metadata, OpenMetadata needs two elements:
 #### Connection Details
-- **trackingUri**: Mlflow Experiment tracking URI. E.g., http://localhost:5000
-- **registryUri**: Mlflow Model registry backend. E.g., mysql+pymysql://mlflow:password@localhost:3307/experiments
+- **trackingUri**: MLflow Experiment tracking URI. E.g., http://localhost:5000
+- **registryUri**: MLflow Model registry backend. E.g., mysql+pymysql://mlflow:password@localhost:3307/experiments
 {% /extraContent %}
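For readers who want to see how these two fields fit into an ingestion workflow configuration, here is a minimal sketch expressed as a Python dict. Only `trackingUri` and `registryUri` come from the connector docs above; the surrounding keys follow the general OpenMetadata workflow layout and are assumptions to verify against the current MLflow connector reference.

```python
# Illustrative sketch of an MLflow ingestion workflow config, as a Python dict.
# Only trackingUri/registryUri are taken from the docs above; the other keys
# are assumptions based on the usual OpenMetadata workflow shape.
mlflow_workflow = {
    "source": {
        "type": "mlflow",                       # assumed connector type id
        "serviceName": "local_mlflow",          # any unique service name
        "serviceConnection": {
            "config": {
                "type": "Mlflow",
                "trackingUri": "http://localhost:5000",
                "registryUri": "mysql+pymysql://mlflow:password@localhost:3307/experiments",
            }
        },
        "sourceConfig": {"config": {"type": "MlModelMetadata"}},  # assumed
    },
    "sink": {"type": "metadata-rest", "config": {}},
    "workflowConfig": {
        "openMetadataServerConfig": {
            "hostPort": "http://localhost:8585/api",   # your OpenMetadata server
            "authProvider": "openmetadata",
        }
    },
}
```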

View File

@@ -1,13 +1,13 @@
 ---
-title: Nifi
+title: NiFi
 slug: /connectors/pipeline/nifi
 ---
-# Nifi
+# NiFi
-In this section, we provide guides and references to use the Nifi connector.
+In this section, we provide guides and references to use the NiFi connector.
-Configure and schedule Nifi metadata workflows from the OpenMetadata UI:
+Configure and schedule NiFi metadata workflows from the OpenMetadata UI:
 - [Requirements](#requirements)
 - [Metadata Ingestion](#metadata-ingestion)
@@ -21,11 +21,11 @@ To deploy OpenMetadata, check the Deployment guides.
 {% /inlineCallout %}
 ### Metadata
-OpenMetadata supports 2 types of connection for the Nifi connector:
-- **basic authentication**: use username/password to authenticate to Nifi.
+OpenMetadata supports two types of connection for the NiFi connector:
+- **basic authentication**: use a username/password to authenticate to NiFi.
 - **client certificate authentication**: use CA, client certificate and client key files to authenticate.
-The user should be able to send request to the Nifi API and access the `Resources` endpoint.
+The user should be able to send requests to the NiFi API and access the `Resources` endpoint.
 ## Metadata Ingestion
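Since the requirement above is simply that the ingestion user can reach the NiFi `Resources` endpoint, a quick pre-flight check along these lines can save a failed workflow later. This is a rough sketch: the host, credentials, and certificate paths are placeholders, and secured NiFi deployments may require a token-based flow rather than plain basic authentication.

```python
# Pre-flight sketch: confirm the ingestion user can list NiFi resources.
# Host, credentials, and cert paths are placeholders (assumptions).
import requests

NIFI_HOST = "https://nifi.example.com:8443"  # hypothetical NiFi instance

resp = requests.get(
    f"{NIFI_HOST}/nifi-api/resources",
    auth=("ingestion-user", "ingestion-password"),  # basic authentication
    verify="/path/to/ca.pem",                       # CA bundle for TLS
    timeout=10,
)
resp.raise_for_status()
print(f"Reachable: {len(resp.json().get('resources', []))} resources visible")
```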

View File

@@ -6,7 +6,7 @@ slug: /deployment
 # Deploy OpenMetadata in Production
 {%note%}
-Are you looking to do POC? It won't get easier than following our [Quickstart](/quickstart) guide!
+Are you looking to do a POC? It won't get easier than following our [Quickstart](/quick-start) guide!
 {%/note%}

View File

@@ -6,3 +6,35 @@ slug: /how-to-guides
 # How to Guides
 The How-to Guides will give you a walkthrough on how to do things in OpenMetadata.
+# Overview of OpenMetadata
+## What is OpenMetadata?
+OpenMetadata is an all-in-one platform for data discovery, lineage, data quality, observability, governance, and team collaboration. It is one of the fastest-growing open-source projects, with a vibrant community and adoption by a diverse set of companies across industry verticals. Powered by a centralized metadata store based on Open Metadata Standards/APIs and supporting connectors to a wide range of data services, OpenMetadata enables end-to-end metadata management, giving you the freedom to unlock the value of your data assets.
+## How OpenMetadata Helps Data Teams
+OpenMetadata is a complete package for data teams to break down team silos, share data assets from multiple sources securely, collaborate around data, and build a documentation-first data culture in the organization.
+{% note %}
+- Centralized, **Single Source of Truth** for all your metadata.
+- **Discover** the right assets in time and reduce dependencies.
+- Foster **Team Collaboration** with conversations, tasks, announcements, and alerts in real time.
+- Build trust in your data with **Data Quality Tests** to ensure completeness and accuracy.
+- Track your data evolution with end-to-end **Data Lineage**.
+- Secure access to sensitive data by defining **Roles and Policies**.
+- Enhance organizational **Data Culture** to gain crucial insights to drive innovation.
+- Define your **Glossary** to build a common understanding of terms within your organization.
+- Implement **Data Governance** to maintain data integrity, security, and compliance.
+{% /note %}

View File

@@ -0,0 +1,20 @@
---
title: How to Delete a Service Connection
slug: /how-to-guides/quick-start-guide-for-admins/how-to-ingest-metadata/delete-service-connection
---
# How to Delete a Service Connection
To delete a service connection, navigate to the service page, click on the ⋮ icon on the right of the page, and then click on **Delete**.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/delete1.png"
alt="Delete a Service Connection"
caption="Delete a Service Connection"
/%}
To permanently delete the database service, type DELETE and click on **Confirm**.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/delete2.png"
alt="Permanently Delete the Database"
caption="Permanently Delete the Database"
/%}
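The steps above cover the UI flow. For completeness, the same hard delete can typically be performed over the REST API; the endpoint path and query parameters below are assumptions modeled on OpenMetadata's service APIs and should be verified against your server's Swagger docs before use.

```python
# Sketch only: endpoint path and query parameters are assumptions; confirm
# them in your OpenMetadata instance's API docs before relying on this.
import requests

OM_SERVER = "http://localhost:8585/api"        # hypothetical server URL
service_id = "<database-service-uuid>"         # placeholder

requests.delete(
    f"{OM_SERVER}/v1/services/databaseServices/{service_id}",
    params={"recursive": "true", "hardDelete": "true"},
    headers={"Authorization": "Bearer <jwt-token>"},
    timeout=30,
).raise_for_status()
```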

View File

@@ -0,0 +1,170 @@
---
title: How to Ingest Metadata
slug: /how-to-guides/quick-start-guide-for-admins/how-to-ingest-metadata
---
# How to Ingest Metadata
*This section deals with integrating third-party sources with OpenMetadata and running the workflows from the UI.*
OpenMetadata gives you the flexibility to bring in your data from third-party sources using the CLI or the UI. Let's start by ingesting your metadata from various sources through the UI. Follow these easy steps to add a connector and fetch metadata on a regular basis at your desired frequency.
{% note %}
**Note:** Ensure that you have **Admin access** in the source tools to be able to add a connector and ingest metadata.
{% /note %}
Admin users can connect to multiple data sources like Databases, Dashboards, Pipelines, ML Models, Messaging, and Storage, as well as Metadata services.
{%note%}
{%inlineCallout
color="violet-70"
bold="Connector Documentation"
icon="add_moderator"
href="/connectors"%}
Refer to the Docs to ingest metadata from multiple sources - Databases, Dashboards, Pipelines, ML Models, Messaging, Storage, as well as Metadata services.
{%/inlineCallout%}
- **Database Services:** [Athena](/connectors/database/athena), [AzureSQL](/connectors/database/azuresql), [BigQuery](/connectors/database/bigquery), [Clickhouse](/connectors/database/clickhouse), [Databricks](/connectors/database/databricks), [Datalake](/connectors/database/datalake), [DB2](/connectors/database/db2), [DeltaLake](/connectors/database/deltalake), [Domo Database](/connectors/database/domo-database), [Druid](/connectors/database/druid), [DynamoDB](/connectors/database/dynamodb), [Glue](/connectors/database/glue), [Hive](/connectors/database/hive), [Impala](/connectors/database/impala), [MariaDB](/connectors/database/mariadb), [MongoDB](/connectors/database/mongodb), [MSSQL](/connectors/database/mssql), [MySQL](/connectors/database/mysql), [Oracle](/connectors/database/oracle), [PinotDB](/connectors/database/pinotdb), [Postgres](/connectors/database/postgres), [Presto](/connectors/database/presto), [Redshift](/connectors/database/redshift), [Salesforce](/connectors/database/salesforce), [SAP Hana](/connectors/database/sap-hana), [SingleStore](/connectors/database/singlestore), [Snowflake](/connectors/database/snowflake), [SQLite](/connectors/database/sqlite), [Trino](/connectors/database/trino), and [Vertica](/connectors/database/vertica).
- **Dashboard Services:** [Domo Dashboard](/connectors/dashboard/domo-dashboard), [Looker](/connectors/dashboard/looker), [Metabase](/connectors/dashboard/metabase), [Mode](/connectors/dashboard/mode), [PowerBI](/connectors/dashboard/powerbi), [Qlik Sense](/connectors/dashboard/qliksense), [QuickSight](/connectors/dashboard/quicksight), [Redash](/connectors/dashboard/redash), [Superset](/connectors/dashboard/superset), and [Tableau](/connectors/dashboard/tableau).
- **Messaging Services:** [Kafka](/connectors/messaging/kafka), [Kinesis](/connectors/messaging/kinesis), and [Redpanda](/connectors/messaging/redpanda).
- **Pipeline Services:** [Airbyte](/connectors/pipeline/airbyte), [Airflow](/connectors/pipeline/airflow), [Dagster](/connectors/pipeline/dagster), [Databricks Pipeline](/connectors/pipeline/databricks-pipeline), [Domo Pipeline](/connectors/pipeline/domo-pipeline), [Fivetran](/connectors/pipeline/fivetran), [Glue Pipeline](/connectors/pipeline/glue-pipeline), [NiFi](/connectors/pipeline/nifi), and [Spline](/connectors/pipeline/spline).
- **ML Model Services:** [MLflow](/connectors/ml-model/mlflow), and [Sagemaker](/connectors/ml-model/sagemaker).
- **Storage Service:** [Amazon S3](/connectors/storage/s3)
- **Metadata Services:** [Amundsen](/connectors/metadata/amundsen), and [Atlas](/connectors/metadata/atlas)
{%/note%}
Let's start with an example of fetching metadata from a database service, in this case, Snowflake.
- To create a service connection, click on **Settings** in the left nav bar. Navigate to the **Services** section, and click on **Databases**. Click on **Add New Service**.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/connector1.jpg"
alt="Create a Service Connection"
caption="Create a Service Connection"
/%}
- Select the Database service of your choice, for example, Snowflake. Click **Next**.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/connector2.jpg"
alt="Select the Database Connector"
caption="Select the Database Connector"
/%}
- To configure Snowflake, enter a unique service name. Click **Next**.
- **Name:** No spaces allowed. Apart from letters and numbers, you can use _ - . & ( )
- **Description:** It is optional, but best to add documentation to improve data culture.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake1.png"
alt="Configure Snowflake"
caption="Configure Snowflake"
/%}
- Enter the **Connection Details**. The Connector documentation is available right within OpenMetadata in the right side panel. The connector details will differ based on the service selected. Users can add their credentials to create a service and further set up the workflows.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake2.png"
alt="Connection Details"
caption="Connection Details"
/%}
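As a concrete illustration of the kind of details the Snowflake form asks for, here is a hedged sketch. The field names mirror the typical OpenMetadata Snowflake connection schema, but the in-app docs panel mentioned above remains the authoritative reference; all values are placeholders.

```python
# Sketch of typical Snowflake connection details (placeholders throughout).
# Field names are assumptions to be checked against the in-app connector docs.
snowflake_connection = {
    "type": "Snowflake",
    "username": "openmetadata_ingest",          # hypothetical service account
    "password": "<use-a-secrets-manager>",
    "account": "abc12345.us-east-1",            # Snowflake account identifier
    "warehouse": "COMPUTE_WH",
    "database": "ANALYTICS",                    # optional: restrict to one database
}
```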
- Users can **Test the Connection** before creating the service. Test Connection checks for access and verifies what details can be ingested using the connection.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake3.png"
alt="Test the Connection"
caption="Test the Connection"
/%}
- The **Connection Status** will verify access to the service as well as to the data assets. Once the connection has been tested, you can save the details.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/testconnection1.png"
alt="Connection Successful"
caption="Connection Successful"
/%}
- Once the database service is created and the connections are established, admins can set up Pipelines to ingest all the source data into OpenMetadata.
- Clicking on **View Service** will navigate to the Database service page, where you can view the Databases, Ingestion, and Connection Details Tabs. You can also **Add the Metadata Ingestion** from the Ingestion tab.
- Or, you can directly start with **Adding Ingestion**.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake4.png"
alt="Snowflake Service Created"
caption="Snowflake Service Created"
/%}
{% note %}
**Tip:** In the Service page, the **Connection Tab** provides information on the connection details as well as details on what data can be ingested from the source using this connection.
{% /note %}
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake5.png"
alt="View Snowflake Service"
caption="View Snowflake Service"
/%}
- Click on **Add Ingestion** and enter the details to ingest metadata:
- **Name:** The name is auto-generated; it combines the service name with a randomly generated text to create a unique name.
- **Database Filter Pattern:** to include or exclude certain databases. A database service has multiple databases, of which you can selectively ingest the required ones. (A sketch of typical filter patterns follows the screenshot below.)
- **Schema Filter Pattern:** to include or exclude certain schemas. A database can have multiple schemas, of which you can selectively ingest the required ones.
- **Table Filter Pattern:** to include or exclude certain tables. Use the toggle options to:
- Use FQN for Filtering
- Include Views - to generate lineage
- Include Tags
- Enable Debug Log: We recommend enabling the debug log.
- Mark Deleted Tables, or
- Mark All Deleted Tables.
- **View Definition Parsing Timeout Limit:** The default is set to 300.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake6.png"
alt="Configure Metadata Ingestion"
caption="Configure Metadata Ingestion"
/%}
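The filter patterns above are regular expressions. The sketch below shows how such filters typically look in the equivalent ingestion configuration; the key names are assumptions based on the standard database metadata source config, and the regex values are only examples.

```python
# Example filter patterns (regular expressions). Key names are assumptions
# based on the standard DatabaseMetadata source config; values are examples.
database_metadata_source_config = {
    "type": "DatabaseMetadata",
    "databaseFilterPattern": {"includes": ["^ANALYTICS$"]},
    "schemaFilterPattern": {"excludes": ["^INFORMATION_SCHEMA$", "^TMP_.*"]},
    "tableFilterPattern": {"includes": [".*_fact$", ".*_dim$"]},
    "includeViews": True,            # needed if you want lineage from views later
    "includeTags": True,
    "markDeletedTables": True,
    "viewParsingTimeoutLimit": 300,  # "View Definition Parsing Timeout Limit"
}
```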
- **Schedule Metadata Ingestion** - Define how often the metadata ingestion pipeline should run. Users can also use a **Custom Cron** expression (example expressions follow the screenshot below).
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/schedule.png"
alt="Schedule and Deploy Metadata Ingestion"
caption="Schedule and Deploy Metadata Ingestion"
/%}
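For the **Custom Cron** option, standard five-field cron syntax applies (minute, hour, day of month, month, day of week). A few common examples:

```python
# Common cron expressions for the "Custom Cron" scheduling option.
EVERY_DAY_AT_2_AM = "0 2 * * *"    # daily at 02:00
EVERY_SUNDAY_NOON = "0 12 * * 0"   # weekly, Sundays at 12:00
EVERY_SIX_HOURS   = "0 */6 * * *"  # four times a day
```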
After the ingestion pipeline has been created and deployed successfully, click on **View Service**. The **Ingestion Tab** will provide all the details for the recent runs, such as whether the pipeline is queued, running, failed, or successful. On hovering over the ingestion details, admin users can view the scheduling frequency, as well as the start and end times for the recent runs. Users can perform certain actions, like:
- **Run** the pipeline now.
- **Kill** to end all the currently running pipelines.
- **Redeploy:** When a service connection is set up, it fetches the data as per the access provided. If the connection credentials are changed at a later point in time, redeploying will fetch additional data with the updated access, if any.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/view-service.png"
alt="View Service Ingestion"
caption="View Service Ingestion"
/%}
By connecting to a database service, you can ingest the databases, schemas, tables, and columns. In the Service page, the **Databases Tab** will display all the ingested databases. Users can further drill down to view the **Schemas** and **Tables**.
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake7.png"
alt="View Table Details"
caption="View Table Details"
/%}
{% note %}
**Note:** Once you've run a metadata ingestion pipeline, you can create separate pipelines to bring in [**Usage**](/connectors/ingestion/workflows/usage), [**Lineage**](/connectors/ingestion/workflows/lineage), [**dbt**](/connectors/ingestion/workflows/dbt), or to run the [**Profiler**](/connectors/ingestion/workflows/profiler). To add ingestion pipelines, select the required type of ingestion and enter the required details.
{% /note %}
{% image
src="/images/v1.1.2/how-to-guides/quick-start-guide-for-admins/snowflake8.png"
alt="Add Ingestion Pipelines for Usage, Lineage, Profiler, and dbt"
caption="Add Ingestion Pipelines for Usage, Lineage, Profiler, and dbt"
/%}
Admin users can create, edit, or delete services. They can also view the connection details for the existing services.
{% note %}
**Pro Tip:** Refer to the [Best Practices for Metadata Ingestion](/connectors/ingestion/best-practices).
{% /note %}
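The walkthrough above is UI-driven. The same workflow configuration can typically also be run externally with the `openmetadata-ingestion` Python package; the sketch below follows the commonly documented pattern, but the import path and class names vary between releases, so treat it as an assumption to verify against the Run Externally docs for your version.

```python
# Sketch: run a saved workflow config outside the UI (assumed API surface;
# verify the import path against your installed openmetadata-ingestion version).
import yaml
from metadata.workflow.metadata import MetadataWorkflow

with open("snowflake_metadata.yaml") as f:
    config = yaml.safe_load(f)

workflow = MetadataWorkflow.create(config)
workflow.execute()
workflow.raise_from_status()   # fail loudly if the run had errors
workflow.stop()
```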

View File

@@ -0,0 +1,13 @@
---
title: Quick Start Guide for Admins
slug: /how-to-guides/quick-start-guide-for-admins
---
# Quick Start Guide for Admins
Admin users can manage all the data assets, with full permissions to create, edit, or delete. Admins can manage Roles, Policies, Services, Notifications, Custom Properties, Data Insights, and more. They can add other users, or create teams to onboard users. An organization can have multiple Admins so that separate Admins can effectively manage different teams and departments.
Get started with OpenMetadata in just **three quick and easy steps**.
1. [Ingest your Data from Multiple Sources](/how-to-guides/quick-start-guide-for-admins/how-to-ingest-metadata).
2. [Create Teams](/how-to-guides/quick-start-guide-for-admins/teams-and-users).
3. Add Users to Start Collaborating on Data.

View File

@@ -1,11 +1,11 @@
 ---
-title: How To Organise Teams And users
-slug: /how-to-guides/teams-and-users/how-to-organise-teams-and-users
+title: How to Organise Teams and Users
+slug: /how-to-guides/quick-start-guide-for-admins/teams-and-users/how-to-organise-teams-and-users
 ---
-# How To Organise Teams And users
+# How to Organise Teams and Users
-## Team structure in OpenMetadata
+## Team Structure in OpenMetadata
 In OpenMetadata, we have a hierarchical team structure with `teamType` that can be `Organization`, `Business Unit`, `Division`, `Department`, and `Group` (the default team type).
@@ -25,7 +25,7 @@ alt="team-structure"
 /%}
-## How to change the team type
+## How to Change the Team Type
 Let's say you have a team `Cloud_Infra` of type `Department` and you want to change it to the type `BusinessUnit`; you can easily do that through the UI.

View File

@@ -1,9 +1,13 @@
 ---
-title: Team structure in OpenMetadata
-slug: /how-to-guides/teams-and-users
+title: Team Structure in OpenMetadata
+slug: /how-to-guides/quick-start-guide-for-admins/teams-and-users
 ---
-# Team structure in OpenMetadata
+# Manage Teams and Users
+OpenMetadata's versatile hierarchical team structure helps you align with your organization's setup. Admins can mirror their organizational hierarchy by creating various team types. You can onboard new users to the relevant teams. An organization can have multiple Admins, so that different teams and departments can be effectively managed by separate Admins.
+# Team Structure in OpenMetadata
 In OpenMetadata, we have a hierarchical team structure with `teamType` that can be `Organization`, `Business Unit`, `Division`, `Department`, and `Group` (the default team type).
@@ -17,9 +21,7 @@ In OpenMetadata, we have a hierarchical team structure with `teamType` that can be `O
 - `Group` is the last level of the team in the hierarchy. It can have only `Users` as children and not any other teams. It can have all the team types as parents. **It can have multiple parents**.
 {% image
-src="/images/v1.1.2/how-to-guides/teams-and-users/teams-structure.png"
+src="/images/v1.1.2/how-to-guides/teams-and-users/teams.png"
 alt="team-structure"
 /%}
-/%}
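To make the `teamType` hierarchy above concrete, here is a hedged sketch of creating a `Group` team under an existing `Department` through the REST API. The endpoint paths and payload fields are assumptions modeled on OpenMetadata's team API and should be checked against your server's API docs.

```python
# Sketch: create a Group team under an existing Department (assumed API shape).
import requests

OM_API = "http://localhost:8585/api"                  # hypothetical server URL
HEADERS = {"Authorization": "Bearer <jwt-token>"}

# Look up the parent Department team by name (assumed endpoint).
dept = requests.get(f"{OM_API}/v1/teams/name/Cloud_Infra",
                    headers=HEADERS, timeout=10).json()

# Create the Group; Group teams hold users and may have multiple parents.
requests.post(
    f"{OM_API}/v1/teams",
    headers=HEADERS,
    json={
        "name": "data-platform",
        "displayName": "Data Platform",
        "teamType": "Group",
        "parents": [dept["id"]],
    },
    timeout=10,
).raise_for_status()
```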

View File

@@ -374,16 +374,16 @@ site_menu:
     url: /connectors/dashboard/powerbi
   - category: Connectors / Dashboard / PowerBI / Run Externally
     url: /connectors/dashboard/powerbi/yaml
-  - category: Connectors / Dashboard / QuickSight
-    url: /connectors/dashboard/quicksight
-  - category: Connectors / Dashboard / QuickSight / Run Externally
-    url: /connectors/dashboard/quicksight/yaml
   - category: Connectors / Dashboard / Qlik Sense
     url: /connectors/dashboard/qliksense
   - category: Connectors / Dashboard / Qlik Sense / Run Externally
     url: /connectors/dashboard/qliksense/yaml
   - category: Connectors / Dashboard / Qlik Sense / Export Certificates
     url: /connectors/dashboard/qliksense/certificates
+  - category: Connectors / Dashboard / QuickSight
+    url: /connectors/dashboard/quicksight
+  - category: Connectors / Dashboard / QuickSight / Run Externally
+    url: /connectors/dashboard/quicksight/yaml
   - category: Connectors / Dashboard / Redash
     url: /connectors/dashboard/redash
   - category: Connectors / Dashboard / Redash / Run Externally
@@ -456,9 +456,9 @@ site_menu:
     url: /connectors/pipeline/glue-pipeline
   - category: Connectors / Pipeline / Glue Pipeline / Run Externally
     url: /connectors/pipeline/glue-pipeline/yaml
-  - category: Connectors / Pipeline / Nifi
+  - category: Connectors / Pipeline / NiFi
     url: /connectors/pipeline/nifi
-  - category: Connectors / Pipeline / Nifi / Run Externally
+  - category: Connectors / Pipeline / NiFi / Run Externally
     url: /connectors/pipeline/nifi/yaml
   - category: Connectors / Pipeline / Spline
     url: /connectors/pipeline/spline
@@ -571,16 +571,22 @@ site_menu:
     color: violet-70
     icon: openmetadata
+  - category: How to Guides / Quick Start Guide for Admins
+    url: /how-to-guides/quick-start-guide-for-admins
+  - category: How to Guides / Quick Start Guide for Admins / How to Ingest Metadata
+    url: /how-to-guides/quick-start-guide-for-admins/how-to-ingest-metadata
+  - category: How to Guides / Quick Start Guide for Admins / How to Ingest Metadata / How to Delete a Service Connection
+    url: /how-to-guides/quick-start-guide-for-admins/how-to-ingest-metadata/delete-service-connection
+  - category: How to Guides / Quick Start Guide for Admins / Manage Teams and Users
+    url: /how-to-guides/quick-start-guide-for-admins/teams-and-users
+  - category: How to Guides / Quick Start Guide for Admins / Manage Teams and Users / How to Organise Teams and Users
+    url: /how-to-guides/quick-start-guide-for-admins/teams-and-users/how-to-organise-teams-and-users
   - category: How to Guides / CLI Ingestion with basic auth
     url: /how-to-guides/cli-ingestion-with-basic-auth
   - category: How to Guides / Feature configurations
     url: /how-to-guides/feature-configurations
   - category: How to Guides / Feature configurations / Bots
     url: /how-to-guides/feature-configurations/bots
-  - category: How to Guides / Teams and Users
-    url: /how-to-guides/teams-and-users
-  - category: How to Guides / Teams and Users / How to Organise Teams and Users
-    url: /how-to-guides/teams-and-users/how-to-organise-teams-and-users
   - category: How to Guides / How to add a custom property to an entity
     url: /how-to-guides/how-to-add-custom-property-to-an-entity
   - category: How to Guides / How to add Custom Logo

[16 binary image files added (how-to guide screenshots, 50 KiB to 749 KiB); previews not shown]