[Docs] - Prepare 1.2 docs (#13706)

* Update workflow imports

* LineageRequest without desc

* Usage Workflow

* Usage Workflow

* 1.2 docs publish

* 1.2 docs publish
Pere Miquel Brull 2023-10-25 16:30:51 +02:00 committed by GitHub
parent ecc03ccc89
commit 41a2aeb1af
841 changed files with 170 additions and 69 deletions


@@ -99,7 +99,7 @@ pip install openmetadata-managed-apis==x.y.z
## Deprecation Notice
- OpenMetadata only supports Python version 3.8 to 3.10.
- OpenMetadata only supports Python versions 3.8 to 3.10. We will add support for Python 3.11 in release 1.3.
## Breaking Changes for 1.2 Stable Release
@@ -115,4 +115,57 @@ then there is no way to link a query to a service and the query will be removed.
- The Domo Database, Dashboard, and Pipeline connectors renamed the `sandboxDomain` property to `instanceDomain`.
### Ingestion Framework Changes
We have reorganized the structure of the `Workflow` classes, which requires updated imports:
- **Metadata Workflow**
- From: `from metadata.ingestion.api.workflow import Workflow`
- To: `from metadata.workflow.metadata import MetadataWorkflow`
- **Lineage Workflow**
- From: `from metadata.ingestion.api.workflow import Workflow`
- To: `from metadata.workflow.metadata import MetadataWorkflow` (same as metadata)
- **Usage Workflow**
- From: `from metadata.ingestion.api.workflow import Workflow`
- To: `from metadata.workflow.usage import UsageWorkflow`
- **Profiler Workflow**
- From: `from metadata.profiler.api.workflow import ProfilerWorkflow`
- To: `from metadata.workflow.profiler import ProfilerWorkflow`
- **Data Quality Workflow**
- From: `from metadata.data_quality.api.workflow import TestSuiteWorkflow`
- To: `from metadata.workflow.data_quality import TestSuiteWorkflow`
- **Data Insights Workflow**
- From: `from metadata.data_insight.api.workflow import DataInsightWorkflow`
- To: `from metadata.workflow.data_insight import DataInsightWorkflow`
- **Elasticsearch Reindex Workflow**
- From: `from metadata.ingestion.api.workflow import Workflow`
- To: `from metadata.workflow.metadata import MetadataWorkflow` (same as metadata)
The `Workflow` class that you import can then be called as follows:
```python
from metadata.workflow.workflow_output_handler import print_status
workflow = workflow_class.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
print_status(workflow)  # Replaces the former `workflow.print_status()` method
workflow.stop()
```
If you run your workflows externally and start seeing `ImportError`s, review the updated import paths above.
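The import moves above can be summarized as a lookup table. This is only an illustrative sketch for auditing your own scripts (the `new_import` helper is not part of the OpenMetadata package); the dotted paths are taken verbatim from the list above:

```python
# Pre-1.2 workflow class paths mapped to their 1.2 replacements.
# Note: the generic `metadata.ingestion.api.workflow.Workflow` maps to
# MetadataWorkflow for metadata, lineage, and Elasticsearch reindex runs,
# but usage runs now need `metadata.workflow.usage.UsageWorkflow` instead.
WORKFLOW_IMPORT_MOVES = {
    "metadata.ingestion.api.workflow.Workflow": "metadata.workflow.metadata.MetadataWorkflow",
    "metadata.profiler.api.workflow.ProfilerWorkflow": "metadata.workflow.profiler.ProfilerWorkflow",
    "metadata.data_quality.api.workflow.TestSuiteWorkflow": "metadata.workflow.data_quality.TestSuiteWorkflow",
    "metadata.data_insight.api.workflow.DataInsightWorkflow": "metadata.workflow.data_insight.DataInsightWorkflow",
}


def new_import(old_path: str) -> str:
    """Return the 1.2 dotted path for a pre-1.2 workflow class path."""
    return WORKFLOW_IMPORT_MOVES[old_path]
```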
### Metadata CLI Changes
In 1.1.7 and below, you could run the Usage Workflow with `metadata ingest -c <path to yaml>`. Starting with 1.2, the Usage Workflow has its own command: `metadata usage -c <path to yaml>`.
### Other Changes
- Pipeline Status timestamps are now expressed in milliseconds.
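For example, a status timestamp previously stored in epoch seconds must now be produced in epoch milliseconds. A minimal sketch of the conversion (the helper names are illustrative, not part of the OpenMetadata API):

```python
from datetime import datetime, timezone


def now_millis() -> int:
    """Current UTC time as an epoch timestamp in milliseconds,
    the unit Pipeline Status expects from 1.2 onwards."""
    return int(datetime.now(timezone.utc).timestamp() * 1000)


def seconds_to_millis(ts_seconds: int) -> int:
    """Upgrade an epoch-seconds timestamp to milliseconds."""
    return ts_seconds * 1000
```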


@@ -709,7 +709,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -746,7 +747,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -717,7 +717,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -754,7 +755,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -665,7 +665,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -702,7 +703,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -682,7 +682,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -719,7 +720,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -741,7 +741,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -778,7 +779,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -664,7 +664,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -701,7 +702,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -750,7 +750,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -787,7 +788,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()


@@ -638,7 +638,8 @@ Here we are also importing all the basic requirements to parse YAMLs, handle dat
import yaml
from datetime import timedelta
from airflow import DAG
from metadata.profiler.api.workflow import ProfilerWorkflow
from metadata.workflow.profiler import ProfilerWorkflow
from metadata.workflow.workflow_output_handler import print_status
try:
from airflow.operators.python import PythonOperator
@@ -675,7 +676,7 @@ def metadata_ingestion_workflow():
workflow = ProfilerWorkflow.create(workflow_config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
print_status(workflow)
workflow.stop()
