Fix some links on ingestion docs (#15417)
Parent: 4a54cfa7cc
Commit: d7ac1812d7
@@ -102,15 +102,15 @@ It depends on where and how the Hive / Rest Catalog is set up and where the Iceberg…

 - **Warehouse Location (Optional)**: Custom Warehouse Location. Most Catalogs already have the Warehouse Location defined properly, so this shouldn't be needed. In case of a custom implementation you can pass the location here.

   **For example**: 's3://my-bucket/warehouse/'

 - **Ownership Property**: Table property to look for the Owner. It defaults to 'owner'.

   The Owner should be the same e-mail set on the OpenMetadata user/group.

-**File System**
-
-- **Local**:
+#### **File System**
+
+- **Local**
+- [**AWS Credentials**](#aws-credentials)
+- [**Azure Credentials**](#azure-credentials)
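To make these options concrete, here is a minimal sketch of how they might appear in an ingestion workflow config, written in the YAML-in-Python pattern OpenMetadata's ingestion docs use. Everything here is illustrative: the service and catalog names are placeholders, and field spellings such as `warehouseLocation` and `ownershipProperty` are assumptions to verify against the Iceberg connection schema.

```python
import yaml

# Hypothetical sketch only: field names (warehouseLocation, ownershipProperty)
# mirror the options described above but must be checked against the published
# Iceberg connection schema; service/catalog names are placeholders.
CONFIG = """
source:
  type: iceberg
  serviceName: my_iceberg_service
  serviceConnection:
    config:
      type: Iceberg
      catalog:
        name: my_catalog
        warehouseLocation: "s3://my-bucket/warehouse/"  # optional override
      ownershipProperty: owner  # table property holding the owner's e-mail
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
"""

workflow_config = yaml.safe_load(CONFIG)
# Sanity-check the two options discussed above before running the workflow.
assert workflow_config["source"]["serviceConnection"]["config"]["ownershipProperty"] == "owner"
```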
@@ -23,8 +23,9 @@ Configure and schedule Amundsen metadata and profiler workflows from the OpenMetadata…

 ## Requirements

-Before this, you must ingest the database / messaging service you want to get metadata for.
-For more details click [here](/connectors/metadata/amundsen#create-database-service)
+Before this, you must create the service you want to get metadata for.
+You can learn how to do it by following the initial part of the Connector documentation for the service.
+You can find the connectors list [here](/connectors).

 ### Python Requirements
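Once the underlying services exist in OpenMetadata, the Amundsen source is configured like any other connector. Below is a hedged sketch: the connection fields are assumptions based on the common connector layout, and the neo4j `hostPort` and credentials are placeholders to verify against the Amundsen connector documentation.

```python
import yaml

# Hedged sketch of an Amundsen workflow config; verify the exact connection
# fields (username, password, hostPort) against the Amundsen connector docs.
CONFIG = """
source:
  type: amundsen
  serviceName: local_amundsen  # placeholder
  serviceConnection:
    config:
      type: Amundsen
      username: neo4j                    # placeholder credentials
      password: my_password
      hostPort: "bolt://localhost:7687"  # Amundsen's neo4j endpoint (assumed)
  sourceConfig:
    config: {}
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
"""

workflow_config = yaml.safe_load(CONFIG)
```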
@@ -304,8 +304,7 @@ Let's remark on the differences between `git-sync` and what we want to achieve by…

 Then, should you use `git-sync`?

-- If you have an existing Airflow instance, and you want to build and maintain your own ingestion DAGs ([example](https://docs.open-metadata.org/v1.0.0/connectors/database/snowflake/airflow#2.-prepare-the-ingestion-dag)),
-  then you can go for it.
+- If you have an existing Airflow instance, and you want to build and maintain your own ingestion DAGs, then you can go for it. Check a DAG example [here](/deployment/ingestion/external/airflow#example).
 - If instead, you want to use the full deployment process from OpenMetadata, `git-sync` would not be the right tool, since the DAGs won't be backed up by Git, but rather created from OpenMetadata. Note that if anything
   were to happen where you lose the Airflow volumes, you can just redeploy the DAGs from OpenMetadata.
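For the "build and maintain your own ingestion DAGs" route, such a DAG has roughly the following shape. This is a sketch, not the canonical example from the linked page: the `MetadataWorkflow` import path matches recent openmetadata-ingestion releases but should be verified for your version, and the config path and DAG name are placeholders.

```python
from datetime import datetime, timedelta
from pathlib import Path

import yaml
from airflow import DAG
from airflow.operators.python import PythonOperator

# Import path is an assumption for openmetadata-ingestion 1.x; older releases
# exposed a similar Workflow class under metadata.ingestion.api.workflow.
from metadata.workflow.metadata import MetadataWorkflow


def metadata_ingestion_workflow():
    # Placeholder path: a full workflow config (source / sink / workflowConfig)
    # like the sketches above, stored next to the DAG file.
    workflow_config = yaml.safe_load(Path("/opt/airflow/dags/ingestion.yaml").read_text())
    workflow = MetadataWorkflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()  # fail the task if the ingestion failed
    workflow.stop()


with DAG(
    "my_ingestion_dag",  # placeholder name
    schedule_interval=timedelta(days=1),
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    PythonOperator(
        task_id="ingest",
        python_callable=metadata_ingestion_workflow,
    )
```

Because a DAG like this lives in your own repository, `git-sync` can version and ship it; DAGs deployed by OpenMetadata itself are instead recreated from the server, which is exactly the trade-off described above.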
@@ -748,7 +748,7 @@ site_menu:
 - category: Connectors / Metadata / Atlas
   url: /connectors/metadata/atlas
 - category: Connectors / Metadata / Atlas / Run Externally
-  url: /connectors/metadata/atlas/external
+  url: /connectors/metadata/atlas/yaml
 - category: Connectors / Metadata / Alation
   url: /connectors/metadata/alation
 - category: Connectors / Metadata / Alation / Run Externally