Update profiler dq doc (#12182)

* doc: update documentation for profiler and dq

* doc: renamed image path from 1.0.0 to 1.1.0
This commit is contained in:
Teddy 2023-06-27 15:32:38 +02:00 committed by GitHub
parent 2902b0e28a
commit a6306e1ca4
34 changed files with 343 additions and 218 deletions


@ -0,0 +1,208 @@
---
title: Custom Tests
slug: /connectors/ingestion/workflows/data-quality/custom-tests
---
# Adding Custom Tests
While OpenMetadata provides out-of-the-box tests, you may want to write test results from your own custom quality test suite, or define your own data quality tests to be run inside OpenMetadata. This is very easy to do using the API and our Python SDK.
### Step 1: Creating a `TestDefinition`
First, you'll need to create a Test Definition for your test. You can send a POST request to the `/api/v1/dataQuality/testDefinitions` endpoint to create your Test Definition. You will need to pass at minimum the following data in the body of your request.
```json
{
"description": "<your test definition description>",
"entityType": "<TABLE or COLUMN>",
"name": "<your_test_name>",
"testPlatforms": ["<any of OpenMetadata, GreatExpectations, dbt, Deequ, Soda, Other>"],
"parameterDefinition": [
{
"name": "<name>"
},
{
"name": "<name>"
}
]
}
```
Here is a complete cURL request:
```bash
curl --request POST 'http://localhost:8585/api/v1/dataQuality/testDefinitions' \
--header 'Content-Type: application/json' \
--data-raw '{
"description": "A demo custom test",
"entityType": "TABLE",
"name": "demo_test_definition",
"testPlatforms": ["Soda", "dbt"],
"parameterDefinition": [{
"name": "ColumnOne"
}]
}'
```
Make sure to keep the `UUID` from the response as you will need it to create the Test Case.
**Important:** If you want the test definition to be available through the OpenMetadata UI, `testPlatforms` needs to include `OpenMetadata`. This also requires extra work that we'll cover below in Step 5. If you are just looking to create a new test definition executable through the OpenMetadata UI, you can skip ahead to Step 5.
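If you prefer to script this step, the same payload can be assembled in Python. Below is a minimal sketch using only the standard library; the helper function is ours for illustration (not part of the SDK), and the actual POST request (shown as a comment) would need authentication configured for your deployment.

```python
import json

def build_test_definition(name, description, entity_type, platforms, parameters):
    """Assemble the minimal TestDefinition payload described above."""
    if entity_type not in ("TABLE", "COLUMN"):
        raise ValueError("entityType must be TABLE or COLUMN")
    return {
        "name": name,
        "description": description,
        "entityType": entity_type,
        "testPlatforms": platforms,
        "parameterDefinition": [{"name": p} for p in parameters],
    }

payload = build_test_definition(
    name="demo_test_definition",
    description="A demo custom test",
    entity_type="TABLE",
    platforms=["Soda", "dbt"],
    parameters=["ColumnOne"],
)
print(json.dumps(payload, indent=2))
# POST it with any HTTP client, e.g.:
# requests.post("http://localhost:8585/api/v1/dataQuality/testDefinitions", json=payload)
```

Parse the JSON response to keep the `id` field, which is the `UUID` referenced above.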
### Step 2: Creating a `TestSuite`
You'll also need to create a Test Suite for your Test Case -- note that you can also use an existing one if you want to. You can send a POST request to the `/api/v1/dataQuality/testSuites/executable` endpoint to create your Test Suite. You will need to pass at minimum the following data in the body of your request.
```json
{
"name": "<test_suite_name>",
"description": "<test suite description>",
"executableEntityReference": "<entityFQN>"
}
```
Here is a complete cURL request:
```bash
curl --request POST 'http://localhost:8585/api/v1/dataQuality/testSuites/executable' \
--header 'Content-Type: application/json' \
--data-raw '{
"name": "<test_suite_name>",
"description": "<test suite description>",
"executableEntityReference": "<entityFQN>"
}'
```
Make sure to keep the `UUID` from the response as you will need it to create the Test Case.
### Step 3: Creating a `TestCase`
Once you have your Test Definition created you can create a Test Case -- which is a specification of your Test Definition. You can send a POST request to the `/api/v1/dataQuality/testCases` endpoint to create your Test Case. You will need to pass at minimum the following data in the body of your request.
```json
{
"entityLink": "<#E::table::fqn> or <#E::table::fqn::columns::column name>",
"name": "<test_case_name>",
"testDefinition": {
"id": "<test definition UUID>",
"type": "testDefinition"
},
"testSuite": {
"id": "<test suite UUID>",
"type": "testSuite"
}
}
```
**Important:** for `entityLink`, make sure to include the starting and ending `<>`.
Here is a complete cURL request:
```bash
curl --request POST 'http://localhost:8585/api/v1/dataQuality/testCases' \
--header 'Content-Type: application/json' \
--data-raw '{
"entityLink": "<#E::table::local_redshift.dev.dbt_jaffle.customers>",
"name": "custom_test_Case",
"testDefinition": {
"id": "1f3ce6f5-67be-45db-8314-2ee42d73239f",
"type": "testDefinition"
},
"testSuite": {
"id": "3192ed9b-5907-475d-a623-1b3a1ef4a2f6",
"type": "testSuite"
},
"parameterValues": [
{
"name": "colName",
"value": 10
}
]
}'
```
Make sure to note the `fullyQualifiedName` of the Test Case from the response, as you will need it to write Test Case Results.
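The `entityLink` syntax is easy to get wrong, notably the mandatory surrounding `<>`. As a sketch, a small helper (ours, for illustration only) can build it for both table-level and column-level cases:

```python
from typing import Optional

def entity_link(table_fqn: str, column: Optional[str] = None) -> str:
    """Build an entityLink, including the required angle brackets."""
    link = f"#E::table::{table_fqn}"
    if column is not None:
        # column-level tests point at a specific column of the table
        link += f"::columns::{column}"
    return f"<{link}>"

print(entity_link("local_redshift.dev.dbt_jaffle.customers"))
# → <#E::table::local_redshift.dev.dbt_jaffle.customers>
print(entity_link("local_redshift.dev.dbt_jaffle.customers", "number_of_orders"))
# → <#E::table::local_redshift.dev.dbt_jaffle.customers::columns::number_of_orders>
```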
### Step 4: Writing `TestCaseResults` (Optional - if not executing test case through OpenMetadata UI)
Once you have your Test Case created you can write results to it. You can send a PUT request to the `/api/v1/dataQuality/testCases/{test FQN}/testCaseResult` endpoint to add Test Case Results. You will need to pass at minimum the following data in the body of your request.
```json
{
"result": "<result message>",
"testCaseStatus": "<Success or Failed or Aborted>",
"timestamp": <Unix timestamp>,
"testResultValue": [
{
"value": "<value>"
}
]
}
```
Here is a complete cURL request:
```bash
curl --location --request PUT 'http://localhost:8585/api/v1/dataQuality/testCases/local_redshift.dev.dbt_jaffle.customers.custom_test_Case/testCaseResult' \
--header 'Content-Type: application/json' \
--data-raw '{
"result": "found 1 values expected n",
"testCaseStatus": "Success",
"timestamp": 1662129151,
"testResultValue": [{
"value": "10"
}]
}'
```
You will now be able to see your test in the Test Suite or the table entity.
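The result payload above can also be built programmatically. A minimal sketch (the helper is ours for illustration, not an SDK function; the PUT request itself is shown as a comment):

```python
import time

def build_test_case_result(status, message, values, timestamp=None):
    """Assemble the minimal TestCaseResult payload for the PUT request above."""
    if status not in ("Success", "Failed", "Aborted"):
        raise ValueError("testCaseStatus must be Success, Failed or Aborted")
    return {
        "result": message,
        "testCaseStatus": status,
        # default to "now" when no timestamp is supplied
        "timestamp": timestamp if timestamp is not None else int(time.time()),
        "testResultValue": [{"value": str(v)} for v in values],
    }

payload = build_test_case_result(
    "Success", "found 1 values expected n", [10], timestamp=1662129151
)
# PUT to /api/v1/dataQuality/testCases/{test FQN}/testCaseResult, e.g.:
# requests.put(f"http://localhost:8585/api/v1/dataQuality/testCases/{fqn}/testCaseResult",
#              json=payload)
```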
### Step 5: Making Custom Test Case Available Through OpenMetadata UI (Optional)
OpenMetadata offers users the flexibility to create custom test cases that are executable through the user interface. To accomplish this, we'll leverage the `data_quality` submodule of the OpenMetadata namespace package.
#### A. Create Your Namespace Package
The first step in creating your own executable test case is to create a package where you'll write the logic to process the tests. Your package should have at minimum the below structure:
```
metadata/
setup.py
```
To add table and column level test cases to SQLAlchemy sources, you will place your tests respectively in:
- `metadata/data_quality/validations/table/sqlalchemy/<yourTest>.py`
- `metadata/data_quality/validations/column/sqlalchemy/<yourTest>.py`
`<yourTest>` should match the name of your test definition in Step 1.
**Important:** You will need to add an `__init__.py` file in every folder, and these `__init__.py` files should contain the line below:
```python
__path__ = __import__('pkgutil').extend_path(__path__, __name__)
```
#### B. Create your Test Class
Once you have created the different folders, you can add the logic in your `<yourTest>.py` file. You will need to create a class named `<YourTest>Validator` that inherits from `BaseTestValidator`. If you need to, you can also inherit from `SQAValidatorMixin` -- this will give you access to additional methods out of the box. Once completed, you will simply need to implement the `run_validation` method. This method should return a `TestCaseResult` object. You can find a full implementation [here](https://github.com/open-metadata/openmetadata-demo/tree/main/custom-om-test) where we create an entropy test.
```python
class ColumnEntropyToBeBetweenValidator(BaseTestValidator):
"""Implements custom test validator for OpenMetadata.
Args:
BaseTestValidator (_type_): inherits from BaseTestValidator
"""
def run_validation(self) -> TestCaseResult:
"""Run test validation"""
```
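The naming convention above -- module `<yourTest>.py` named after the test definition, class `<YourTest>Validator` -- can be sketched as a tiny helper (illustrative only, not an SDK function):

```python
def validator_class_name(test_definition_name: str) -> str:
    """Map a test definition name to the validator class name the
    data_quality module expects, e.g.
    columnEntropyToBeBetween -> ColumnEntropyToBeBetweenValidator."""
    return test_definition_name[0].upper() + test_definition_name[1:] + "Validator"

print(validator_class_name("columnEntropyToBeBetween"))
# → ColumnEntropyToBeBetweenValidator
```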
#### C. `pip` Install Your Package
Once you have completed A) and B), you only need to `pip install` your package in the environment where the OpenMetadata Python SDK is installed.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/custom-test-definition.png"
alt="Create test case"
caption="Create test case"
/%}
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/custom-test-result.png"
alt="Create test case"
caption="Create test case"
/%}


@ -36,7 +36,7 @@ This section will show you how to configure and run Data Quality pipelines with
## Main Concepts
### Test Suite
Test Suites are logical containers allowing you to group related Test Cases together from different tables.
### Test Definition
Test Definitions are generic test definition elements specific to a test, such as:
@ -47,54 +47,80 @@ Test Definitions are generic tests definition elements specific to a test such a
### Test Cases
Test Cases specify a Test Definition. It will define what condition a test must meet to be successful (e.g. `max=n`, etc.). One Test Definition can be linked to multiple Test Cases.
## Adding Test Cases to an Entity
Test cases are the actual tests that will be run against your entity. This is where you will define the execution time and logic of these tests.
**Note:** you will need to make sure you have the right permission in OpenMetadata to create a test.
### Step 1: Creating a Test Case
Navigate to the entity you want to add a test to (we currently support quality tests only for database entities). Go to the `Profiler & Data Quality` tab. From there, click on the `Add Test` button in the upper right corner and select the type of test you want to implement.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/add-test-case.png"
alt="Write your first test"
caption="Write your first test"
/%}
### Step 2: Select the Test Definition
Select the type of test you want to run and set the parameters (if any) for your test case. If you have selected a `column` test, you will need to select which column you want to execute your test against. Give it a name and then submit it.
**Note:** if you have a profiler workflow running, you will be able to visualize some context around your column or table data.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/add-test-defintion.png"
alt="Write your first test"
caption="Write your first test"
/%}
### Step 3: Set an Execution Schedule (Optional)
If it is the first test you are creating for this entity, you'll need to set an execution time. Click on the `Add Ingestion` button and select a schedule. Note that the time is shown in UTC.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/add-ingestion.png"
alt="Write your first test"
caption="Write your first test"
/%}
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/ingestion-page.png"
alt="Write your first test"
caption="Write your first test"
/%}
## Adding Test Suites Through the UI
Test Suites are logical containers allowing you to group related Test Cases together from different tables.
**Note:** you will need to make sure you have the right permission in OpenMetadata to create a test.
### Step 1: Creating a Test Suite
From the vertical navigation bar, click on `Quality` and navigate to the `By Test Suites` tab. From there, click on the `Add Test Suite` button in the top right corner.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/profiler-tab-view.png"
alt="Write your first test"
caption="Write your first test"
/%}
On the next page, enter the name and description (optional) of your test suite.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/test-suite-page.png"
alt="Create test suite"
caption="Create test suite"
/%}
### Step 2: Add Test Cases
On the next page, you will be able to add existing test cases from different entities to your test suite. This allows you to group together test cases from different entities.
**Note:** Test Case name needs to be unique across the whole platform. A warning message will show if your Test Case name is not unique.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/test-case-page.png"
alt="Create test case"
caption="Create test case"
/%}
## Adding Tests with the YAML Config
When creating a YAML config for a test workflow, the source configuration is very simple.
```yaml
source:
  sourceConfig:
    config:
      type: TestSuite
      entityFullyQualifiedName: <entityFqn>
```
The only sections you need to modify here are the `serviceName` (this name needs to be unique) and `entityFullyQualifiedName` (the entity against which the tests will be executed) keys.
Once you have defined your source configuration, you'll need to define the processor configuration.
```yaml
processor:
  type: "orm-test-runner"
  config:
    forceUpdate: <false|true>
    testCases:
      - name: <testCaseName>
        testDefinitionName: columnValueLengthsToBeBetween
        columnName: <columnName>
        parameterValues:
          - name: minLength
            value: 10
          - name: maxLength
            value: 25
      - name: <testCaseName>
        testDefinitionName: tableRowCountToEqual
        parameterValues:
          - name: value
            value: 10
```
The processor type should be set to `"orm-test-runner"`. For accepted test definition names and parameter value names refer to the [tests page](/connectors/ingestion/workflows/data-quality/tests).
### Key reference:
- `forceUpdate`: if the test case already exists for the entity (based on the test case name), defines the strategy to follow when running the test (i.e. whether or not to update the parameters)
- `testCases`: list of test cases to execute against the referenced entity
  - `name`: test case name
  - `testDefinitionName`: test definition name
  - `columnName`: only applies to column tests. The name of the column to run the test against
  - `parameterValues`: parameter values of the test
`sink` and `workflowConfig` will have the same settings as the ingestion and profiler workflows.
```yaml
source:
  sourceConfig:
    config:
      type: TestSuite
      entityFullyQualifiedName: MySQL.default.openmetadata_db.tag_usage
processor:
  type: "orm-test-runner"
  config:
    forceUpdate: false
    testCases:
      - name: column_value_lenght_tagFQN
        testDefinitionName: columnValueLengthsToBeBetween
        columnName: tagFQN
        parameterValues:
          - name: minLength
            value: 10
          - name: maxLength
            value: 25
      - name: table_row_count_test
        testDefinitionName: tableRowCountToEqual
        parameterValues:
          - name: value
            value: 10
sink:
  type: metadata-rest
```
@ -235,198 +276,72 @@ Note how we are using the `TestSuiteWorkflow` class to load and execute the test
configurations specified above.
## How to Visualize Test Results
### From the Quality Page
From the home page, click on the `Quality` menu item in the vertical navigation. This will bring you to the Quality page where you'll be able to see your test cases either by:
- entity
- test suite
- test cases
If you want to look at your tests grouped by Test Suites, navigate to the `By Test Suites` tab. This will bring you to the Test Suite page where you can select a specific Test Suite.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/test-suite-home-page.png"
alt="Test suite home page"
caption="Test suite home page"
/%}
From there you can select a Test Suite and visualize the results associated with this specific Test Suite.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/test-suite-results.png"
alt="Test suite results page"
caption="Test suite results page"
/%}
### From a Table Entity
Navigate to your table and click on the `Profiler & Data Quality` tab. From there you'll be able to see test results at the table or column level.
#### Table Level Test Results
In the top panel, click on the white background `Data Quality` button. This will bring you to a summary of all your quality tests at the table level.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/table-results-entity.png"
alt="Test suite results table"
caption="Test suite results table"
/%}
## Test Case Resolution Workflow
In v1.1.0 we introduced the ability for users to flag the resolution status of failed test cases. When a test case fails, it will automatically be marked as `New`, indicating that a new failure has happened.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/resolution-workflow-new.png"
alt="Test case resolution workflow - new"
caption="Test case resolution workflow - new"
/%}
The next step for a user is to mark the new failure as `Ack` (acknowledged), signifying to other users that someone is looking into the test failure resolution. When hovering over the resolution status, users will be able to see the time (UTC) and the user who acknowledged the failure.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/resolution-workflow-ack-form.png"
alt="Test case resolution workflow - acknowledge form"
caption="Test case resolution workflow - acknowledge form"
/%}
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/resolution-workflow-ack.png"
alt="Test case resolution workflow - acknowledged"
caption="Test case resolution workflow - acknowledged"
/%}
Then users are able to mark a test as `Resolved`. We made it mandatory for users to 1) select a reason and 2) add a comment when resolving a failed test, so that knowledge can be maintained inside the platform.
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/resolution-workflow-resolved-form.png.png"
alt="Test case resolution workflow - resolved form"
caption="Test case resolution workflow - resolved form"
/%}
{% image
src="/images/v1.1.0/features/ingestion/workflows/data-quality/resolution-workflow-resolved.png.png"
alt="Test case resolution workflow - resolved"
caption="Test case resolution workflow - resolved"
/%}


@ -13,14 +13,14 @@ After the metadata ingestion has been done correctly, we can configure and deplo
This Pipeline will be in charge of feeding the Profiler tab of the Table Entity, as well as running any tests configured in the Entity.
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/profiler-summary-table.png"
alt="Table profile summary page"
caption="Table profile summary page"
/%}
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/profiler-summary-colomn.png"
alt="Column profile summary page"
caption="Column profile summary page"
/%}
@ -31,7 +31,7 @@ This Pipeline will be in charge of feeding the Profiler tab of the Table Entity,
From the Service Page, go to the Ingestions tab to add a new ingestion and click on Add Profiler Ingestion.
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/add-profiler-workflow.png"
alt="Add a profiler service"
caption="Add a profiler service"
/%}
@ -41,7 +41,7 @@ From the Service Page, go to the Ingestions tab to add a new ingestion and click
Here you can enter the Profiler Ingestion details.
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/configure-profiler-workflow.png"
alt="Set profiler configuration"
caption="Set profiler configuration"
/%}
@ -89,13 +89,13 @@ After clicking Next, you will be redirected to the Scheduling form. This will be
Once you have created your profiler, you can adjust some behavior at the table level by going to the table and clicking on the profiler tab.
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/accessing-table-profile-settings.png"
alt="table profile settings"
caption="table profile settings"
/%}
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/table-profile-summary-view.png"
alt="table profile settings"
caption="table profile settings"
/%}


@ -38,7 +38,7 @@ Returns the number of columns in the Table.
System metrics provide information related to DML operations performed on the table. These metrics present a concise view of your data freshness. In a typical data processing flow, tables are updated at a certain frequency. Table freshness will be monitored by confirming a set of operations has been performed against the table. To increase trust in your data assets, OpenMetadata will monitor the `INSERT`, `UPDATE` and `DELETE` operations performed against your table to showcase 2 metrics related to freshness (see below for more details). With this information, you are able to see when a specific operation was last performed and how many rows it affected.
{% image
src="/images/v1.1.0/features/ingestion/workflows/profiler/profiler-freshness-metrics.png"
alt="table profile freshness metrics"
caption="table profile freshness metrics"
/%}


@ -633,6 +633,8 @@ site_menu:
    url: /connectors/ingestion/workflows/data-quality
  - category: Connectors / Ingestion / Workflows / Data Quality / Tests
    url: /connectors/ingestion/workflows/data-quality/tests
  - category: Connectors / Ingestion / Workflows / Data Quality / Custom Tests
    url: /connectors/ingestion/workflows/data-quality/custom-tests
  - category: Connectors / Ingestion / Lineage
    url: /connectors/ingestion/lineage
  - category: Connectors / Ingestion / Lineage / Edit Data Lineage Manually
