Docs: Fixed 404 links (#16028)

Shilpa Vernekar 2024-04-26 18:48:45 +05:30 committed by GitHub
parent 0695398a64
commit e6a2ec147e
2 changed files with 2 additions and 2 deletions

@@ -93,7 +93,7 @@ Then, you can prepare `Run Configurations` to execute the ingestion as you would
 {% image src="/images/v1.3/developers/contribute/build-code-and-run-tests/pycharm-run-config.png" alt="PyCharm run config" caption=" " /%}
 Note that in the example we are preparing a configuration to run and test Superset. In order to understand how to run
-ingestions via the CLI, you can refer to each connector's [docs](/connectors/dashboard/superset/cli).
+ingestions via the CLI, you can refer to each connector's [docs](/connectors/dashboard/superset).
 The important part is that we are not running a script, but rather a `module`: `metadata`. Based on this, we can work as
 we would usually do with the CLI for any ingestion, profiler, or test workflow.
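
For reference, running the ingestion as a `module` rather than a script boils down to invoking `python -m metadata` with the same arguments the CLI accepts; the PyCharm Run Configuration above simply mirrors that. A minimal sketch, where `superset.yaml` is a placeholder workflow file and the exact subcommand names may vary by release:

```bash
# Invoke the metadata entrypoint as a module, exactly as the Run Configuration does.
# superset.yaml is a hypothetical workflow configuration for the Superset connector.
python -m metadata ingest -c superset.yaml

# The same pattern covers the other workflows mentioned above, for example:
# python -m metadata profile -c superset.yaml
# python -m metadata test -c superset.yaml
```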

@@ -72,7 +72,7 @@ After clicking Next, you will be redirected to the Scheduling form. This will be
 ## dbt Ingestion
-We can also generate lineage through [dbt ingestion](/connectors/ingestion/workflows/dbt/ingest-dbt-ui). The dbt workflow can fetch queries that carry lineage information. For a dbt ingestion pipeline, the path to the Catalog and Manifest files must be specified. We also fetch the column level lineage through dbt.
+We can also generate lineage through [dbt ingestion](/connectors/ingestion/workflows/dbt). The dbt workflow can fetch queries that carry lineage information. For a dbt ingestion pipeline, the path to the Catalog and Manifest files must be specified. We also fetch the column level lineage through dbt.
 You can learn more about [lineage ingestion here](/connectors/ingestion/lineage).
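
As a rough illustration of the dbt requirement mentioned in the hunk above, the dbt workflow is run through the same CLI; the workflow YAML it points to must reference the paths to the dbt-generated `catalog.json` and `manifest.json` artifacts. A minimal sketch, where `dbt_workflow.yaml` is a hypothetical configuration file:

```bash
# dbt_workflow.yaml is a placeholder; inside it, the dbt source configuration must
# point at the generated catalog.json and manifest.json files, which is also where
# the column-level lineage is derived from.
metadata ingest -c dbt_workflow.yaml
```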