diff --git a/openmetadata-docs/content/connectors/database/datalake/airflow.md b/openmetadata-docs/content/connectors/database/datalake/airflow.md
index fa881a339a4..d8a8949b9ec 100644
--- a/openmetadata-docs/content/connectors/database/datalake/airflow.md
+++ b/openmetadata-docs/content/connectors/database/datalake/airflow.md
@@ -21,6 +21,29 @@ To deploy OpenMetadata, check the Deployment guides.
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
 custom Airflow plugins to handle the workflow deployment.
 
+**S3 Permissions**
+
+To execute metadata extraction, the AWS account should have enough access to fetch the required data. The bucket policy in AWS requires at least these permissions:
+
+```json
+{
+    "Version": "2012-10-17",
+    "Statement": [
+        {
+            "Effect": "Allow",
+            "Action": [
+                "s3:GetObject",
+                "s3:ListBucket"
+            ],
+            "Resource": [
+                "arn:aws:s3:::<bucket-name>",
+                "arn:aws:s3:::<bucket-name>/*"
+            ]
+        }
+    ]
+}
+```
+
 ### Python Requirements
 
 To run the Datalake ingestion, you will need to install:
diff --git a/openmetadata-docs/content/connectors/database/datalake/cli.md b/openmetadata-docs/content/connectors/database/datalake/cli.md
index 63e7ad64258..c7b4cf553f7 100644
--- a/openmetadata-docs/content/connectors/database/datalake/cli.md
+++ b/openmetadata-docs/content/connectors/database/datalake/cli.md
@@ -21,6 +21,29 @@ To deploy OpenMetadata, check the Deployment guides.
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
 custom Airflow plugins to handle the workflow deployment.
 
+**S3 Permissions**
+
+To execute metadata extraction, the AWS account should have enough access to fetch the required data. The bucket policy in AWS requires at least these permissions:
+
+```json
+{
+    "Version": "2012-10-17",
+    "Statement": [
+        {
+            "Effect": "Allow",
+            "Action": [
+                "s3:GetObject",
+                "s3:ListBucket"
+            ],
+            "Resource": [
+                "arn:aws:s3:::<bucket-name>",
+                "arn:aws:s3:::<bucket-name>/*"
+            ]
+        }
+    ]
+}
+```
+
 ### Python Requirements
 
 To run the Datalake ingestion, you will need to install:
diff --git a/openmetadata-docs/content/connectors/database/datalake/index.md b/openmetadata-docs/content/connectors/database/datalake/index.md
index f7d8be40e31..294561c153b 100644
--- a/openmetadata-docs/content/connectors/database/datalake/index.md
+++ b/openmetadata-docs/content/connectors/database/datalake/index.md
@@ -40,6 +40,29 @@ To deploy OpenMetadata, check the Deployment guides.
 To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
 custom Airflow plugins to handle the workflow deployment.
 
+**S3 Permissions**
+
+To execute metadata extraction, the AWS account should have enough access to fetch the required data. The bucket policy in AWS requires at least these permissions:
+
+```json
+{
+    "Version": "2012-10-17",
+    "Statement": [
+        {
+            "Effect": "Allow",
+            "Action": [
+                "s3:GetObject",
+                "s3:ListBucket"
+            ],
+            "Resource": [
+                "arn:aws:s3:::<bucket-name>",
+                "arn:aws:s3:::<bucket-name>/*"
+            ]
+        }
+    ]
+}
+```
+
 ## Metadata Ingestion
 
 ### 1. Visit the Services Page
diff --git a/openmetadata-docs/content/connectors/database/snowflake/airflow.md b/openmetadata-docs/content/connectors/database/snowflake/airflow.md
index 3e4a226b69e..5059af66678 100644
--- a/openmetadata-docs/content/connectors/database/snowflake/airflow.md
+++ b/openmetadata-docs/content/connectors/database/snowflake/airflow.md
@@ -39,6 +39,7 @@ pip3 install "openmetadata-ingestion[snowflake-usage]"
 
+- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas.
 - While running the usage workflow, Openmetadata fetches the query logs by querying `snowflake.account_usage.query_history` table. For this the snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted IMPORTED PRIVILEGES on the database).
 - If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.
diff --git a/openmetadata-docs/content/connectors/database/snowflake/cli.md b/openmetadata-docs/content/connectors/database/snowflake/cli.md
index dbcd41cceb9..dc8737651a0 100644
--- a/openmetadata-docs/content/connectors/database/snowflake/cli.md
+++ b/openmetadata-docs/content/connectors/database/snowflake/cli.md
@@ -39,6 +39,7 @@ pip3 install "openmetadata-ingestion[snowflake-usage]"
 
+- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas.
 - While running the usage workflow, Openmetadata fetches the query logs by querying `snowflake.account_usage.query_history` table. For this the snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted IMPORTED PRIVILEGES on the database).
 - If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.
diff --git a/openmetadata-docs/content/connectors/database/snowflake/index.md b/openmetadata-docs/content/connectors/database/snowflake/index.md
index dac7079167f..2621c3f8bb3 100644
--- a/openmetadata-docs/content/connectors/database/snowflake/index.md
+++ b/openmetadata-docs/content/connectors/database/snowflake/index.md
@@ -45,6 +45,7 @@ custom Airflow plugins to handle the workflow deployment.
 
+- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas.
 - While running the usage workflow, Openmetadata fetches the query logs by querying `snowflake.account_usage.query_history` table. For this the snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted IMPORTED PRIVILEGES on the database).
 - If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.
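The bucket policy this patch documents can be sanity-checked locally before it is attached to a bucket. A minimal sketch, assuming the policy JSON is pasted in as a string with a hypothetical `my-bucket` placeholder (the `granted_actions` helper below is illustrative, not part of OpenMetadata):

```python
import json

# The minimal bucket policy from the docs; "my-bucket" is a placeholder name.
POLICY = """
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-bucket",
                "arn:aws:s3:::my-bucket/*"
            ]
        }
    ]
}
"""

def granted_actions(policy_json: str) -> set:
    """Collect every Action granted by Allow statements in a policy document."""
    policy = json.loads(policy_json)
    actions = set()
    for stmt in policy["Statement"]:
        if stmt["Effect"] == "Allow":
            acts = stmt["Action"]
            # "Action" may be a single string or a list of strings.
            actions.update([acts] if isinstance(acts, str) else acts)
    return actions

# The Datalake connector needs both s3:GetObject and s3:ListBucket.
missing = {"s3:GetObject", "s3:ListBucket"} - granted_actions(POLICY)
print("missing permissions:", missing or "none")
```

This only checks the policy document's shape; whether the policy is actually effective on the bucket still depends on IAM and any explicit denies.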