Snowflake & S3 permission update in docs (#8296)

Mayur Singal 2022-10-20 23:26:13 +05:30 committed by GitHub
parent 1ef189540f
commit 45adf285ba
6 changed files with 72 additions and 0 deletions


@@ -21,6 +21,29 @@ To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.
**S3 Permissions**
<p> To execute the metadata extraction, the AWS account should have enough access to fetch the required data. The <strong>Bucket Policy</strong> in AWS requires at least these permissions: </p>
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::<my bucket>",
"arn:aws:s3:::<my bucket>/*"
]
}
]
}
```
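
As a quick sanity check before running the ingestion, the credentials it will use can be exercised directly. This is a minimal sketch, not part of the OpenMetadata tooling, assuming `boto3` is installed and that `my-datalake-bucket` and `path/to/sample.csv` stand in for a real bucket and object key:

```python
import boto3
from botocore.exceptions import ClientError

# Stand-in names; replace with your bucket and a known object key.
BUCKET = "my-datalake-bucket"
KEY = "path/to/sample.csv"

# Uses the same credential chain (env vars, profile, instance role) the ingestion will use.
s3 = boto3.client("s3")

try:
    # Exercises s3:ListBucket on arn:aws:s3:::<bucket>
    s3.list_objects_v2(Bucket=BUCKET, MaxKeys=1)
    # Exercises s3:GetObject on arn:aws:s3:::<bucket>/*
    s3.get_object(Bucket=BUCKET, Key=KEY)
    print("ListBucket and GetObject both succeeded")
except ClientError as err:
    print(f"Access check failed: {err}")
```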
### Python Requirements
To run the Datalake ingestion, you will need to install:


@@ -21,6 +21,29 @@ To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.
**S3 Permissions**
<p> To execute the metadata extraction, the AWS account should have enough access to fetch the required data. The <strong>Bucket Policy</strong> in AWS requires at least these permissions: </p>
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::<my bucket>",
"arn:aws:s3:::<my bucket>/*"
]
}
]
}
```
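
Because the statement above has no `Principal` element, it is most directly usable as an identity-based policy attached to the IAM user or role that runs the ingestion. The sketch below attaches it as an inline user policy; it is illustrative only, assuming `boto3` is installed and using `openmetadata-ingestion` and `my-datalake-bucket` as hypothetical user and bucket names:

```python
import json

import boto3

# Hypothetical names; replace with your IAM user and bucket.
USER_NAME = "openmetadata-ingestion"
BUCKET = "my-datalake-bucket"

# Same statement as the policy above, with the bucket ARNs filled in.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }
    ],
}

iam = boto3.client("iam")
# Requires iam:PutUserPolicy on the target user.
iam.put_user_policy(
    UserName=USER_NAME,
    PolicyName="OpenMetadataDatalakeRead",
    PolicyDocument=json.dumps(policy),
)
```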
### Python Requirements
To run the Datalake ingestion, you will need to install:


@@ -40,6 +40,29 @@ To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.
**S3 Permissions**
<p> To execute the metadata extraction, the AWS account should have enough access to fetch the required data. The <strong>Bucket Policy</strong> in AWS requires at least these permissions: </p>
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::<my bucket>",
"arn:aws:s3:::<my bucket>/*"
]
}
]
}
```
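
A dry-run alternative that doesn't touch any objects is IAM's policy simulator. The sketch below is illustrative only, assuming `boto3` is installed, that the caller may use `iam:SimulatePrincipalPolicy`, and that the ARN and bucket names are replaced with real ones:

```python
import boto3

# Stand-ins; replace with the identity the ingestion runs as and your bucket.
POLICY_SOURCE_ARN = "arn:aws:iam::123456789012:user/openmetadata-ingestion"
BUCKET = "my-datalake-bucket"

iam = boto3.client("iam")

# Check each required action against the resource it applies to.
checks = [
    ("s3:ListBucket", f"arn:aws:s3:::{BUCKET}"),
    ("s3:GetObject", f"arn:aws:s3:::{BUCKET}/*"),
]

for action, resource in checks:
    response = iam.simulate_principal_policy(
        PolicySourceArn=POLICY_SOURCE_ARN,
        ActionNames=[action],
        ResourceArns=[resource],
    )
    decision = response["EvaluationResults"][0]["EvalDecision"]
    # "allowed" means the permission is already in place for that resource.
    print(f"{action} on {resource}: {decision}")
```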
## Metadata Ingestion
### 1. Visit the Services Page


@@ -39,6 +39,7 @@ pip3 install "openmetadata-ingestion[snowflake-usage]"
<Note>
- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas (a grant sketch follows this note).
- While running the usage workflow, OpenMetadata fetches the query logs by querying the `snowflake.account_usage.query_history` table.
For this, the Snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted IMPORTED PRIVILEGES on the database).
- If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.
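
The following is a minimal sketch of granting those privileges with `snowflake-connector-python`, not the only valid setup: `OPENMETADATA_ROLE`, `MY_DB`, and `MY_SCHEMA` are placeholder names, and the statements must be run as a role that is allowed to grant them (for example `ACCOUNTADMIN`):

```python
import snowflake.connector

# Placeholder connection details; run as a role that can issue these grants.
conn = snowflake.connector.connect(
    account="my_account",
    user="admin_user",
    password="***",
    role="ACCOUNTADMIN",
)

grants = [
    # Basic metadata ingestion: USAGE on the database and its schemas.
    "GRANT USAGE ON DATABASE MY_DB TO ROLE OPENMETADATA_ROLE",
    "GRANT USAGE ON SCHEMA MY_DB.MY_SCHEMA TO ROLE OPENMETADATA_ROLE",
    # Usage workflow and tag ingestion: read access to the shared SNOWFLAKE database
    # (the alternative to granting the ACCOUNTADMIN role itself).
    "GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE OPENMETADATA_ROLE",
]

with conn.cursor() as cur:
    for statement in grants:
        cur.execute(statement)
conn.close()
```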


@@ -39,6 +39,7 @@ pip3 install "openmetadata-ingestion[snowflake-usage]"
<Note>
- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas.
- While running the usage workflow, OpenMetadata fetches the query logs by querying the `snowflake.account_usage.query_history` table.
For this, the Snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted IMPORTED PRIVILEGES on the database).
- If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.


@@ -45,6 +45,7 @@ custom Airflow plugins to handle the workflow deployment.
<Note>
- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas (a verification sketch follows this note).
- While running the usage workflow, OpenMetadata fetches the query logs by querying the `snowflake.account_usage.query_history` table.
For this, the Snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted IMPORTED PRIVILEGES on the database).
- If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.
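
To confirm the ingestion user actually holds these privileges before kicking off a workflow, a small probe script can be run with the same credentials. This is an illustrative sketch only, assuming `snowflake-connector-python` is installed and using placeholder connection details (`my_account`, `MY_WH`, `MY_DB.MY_SCHEMA`):

```python
import snowflake.connector

# Placeholder credentials for the ingestion user; adjust to your account.
conn = snowflake.connector.connect(
    account="my_account",
    user="openmetadata_user",
    password="***",
    warehouse="MY_WH",
)

probes = {
    # USAGE on the schema: listing its tables should succeed.
    "schema USAGE": "SHOW TABLES IN SCHEMA MY_DB.MY_SCHEMA",
    # Usage workflow: query_history must be readable.
    "query_history access": "SELECT query_text FROM snowflake.account_usage.query_history LIMIT 1",
    # Tag ingestion: tag_references must be readable.
    "tag_references access": "SELECT tag_name FROM snowflake.account_usage.tag_references LIMIT 1",
}

with conn.cursor() as cur:
    for label, sql in probes.items():
        try:
            cur.execute(sql)
            print(f"{label}: OK")
        except snowflake.connector.errors.ProgrammingError as err:
            print(f"{label}: FAILED ({err.msg})")
conn.close()
```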