mirror of https://github.com/open-metadata/OpenMetadata.git
Snowflake & S3 permission update in docs (#8296)
parent 1ef189540f
commit 45adf285ba
@@ -21,6 +21,29 @@ To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.

To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

**S3 Permissions**

<p> To execute the metadata extraction, the AWS account should have enough access to fetch the required data. The <strong>Bucket Policy</strong> in AWS requires at least these permissions: </p>

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<my bucket>",
                "arn:aws:s3:::<my bucket>/*"
            ]
        }
    ]
}
```
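Before running a workflow, it can help to confirm that the credentials the ingestion will use actually satisfy this policy. The following is a minimal sketch, not part of the patched docs: it assumes `boto3` is installed, the default AWS credential chain is configured, and `my-bucket` is a placeholder bucket name.

```python
# Sketch: verify the configured AWS credentials can list the bucket and read an
# object, mirroring the s3:ListBucket and s3:GetObject permissions above.
# "my-bucket" and the prefix are placeholders — replace with your own values.
import boto3


def check_s3_access(bucket: str, prefix: str = "") -> None:
    s3 = boto3.client("s3")  # uses the default credential chain

    # s3:ListBucket — list at most one object under the prefix
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=1)
    contents = listing.get("Contents", [])
    if not contents:
        print(f"ListBucket OK, but no objects found under prefix '{prefix}'")
        return
    key = contents[0]["Key"]
    print(f"ListBucket OK, sample key: {key}")

    # s3:GetObject — read a few bytes of the sample object
    obj = s3.get_object(Bucket=bucket, Key=key)
    obj["Body"].read(16)
    print("GetObject OK")


if __name__ == "__main__":
    check_s3_access("my-bucket")  # hypothetical bucket name
```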

### Python Requirements

To run the Datalake ingestion, you will need to install:
@@ -40,6 +40,29 @@ To deploy OpenMetadata, check the <a href="/deployment">Deployment</a> guides.

To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

**S3 Permissions**

<p> To execute the metadata extraction, the AWS account should have enough access to fetch the required data. The <strong>Bucket Policy</strong> in AWS requires at least these permissions: </p>

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<my bucket>",
                "arn:aws:s3:::<my bucket>/*"
            ]
        }
    ]
}
```

## Metadata Ingestion

### 1. Visit the Services Page
@@ -39,6 +39,7 @@ pip3 install "openmetadata-ingestion[snowflake-usage]"

<Note>

- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas.
- While running the usage workflow, OpenMetadata fetches the query logs by querying the `snowflake.account_usage.query_history` table. For this, the Snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted `IMPORTED PRIVILEGES` on the database); a quick way to verify this access is sketched after this note.
- If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.
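As a rough way to confirm these grants before deploying the usage and tag workflows, something like the sketch below can be run. It is not part of the patched docs: it assumes the `snowflake-connector-python` package is installed, and the account, user, password, and role values are placeholders to replace with your own.

```python
# Sketch: confirm the Snowflake user can read the account_usage views needed
# by the usage and tag workflows. All connection values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="openmetadata_user",    # hypothetical user
    password="...",              # use your own secret management
    role="ACCOUNTADMIN",         # or a role granted IMPORTED PRIVILEGES
)

try:
    cur = conn.cursor()

    # Query history access (usage workflow)
    cur.execute(
        "SELECT query_id FROM snowflake.account_usage.query_history LIMIT 1"
    )
    print("query_history OK:", cur.fetchone())

    # Tag references access (tag ingestion)
    cur.execute(
        "SELECT tag_name FROM snowflake.account_usage.tag_references LIMIT 1"
    )
    print("tag_references OK:", cur.fetchone())
finally:
    conn.close()
```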
@@ -45,6 +45,7 @@ custom Airflow plugins to handle the workflow deployment.

<Note>

- To ingest basic metadata, the Snowflake user must have at least `USAGE` privileges on the required schemas.
- While running the usage workflow, OpenMetadata fetches the query logs by querying the `snowflake.account_usage.query_history` table. For this, the Snowflake user should be granted the `ACCOUNTADMIN` role (or a role granted `IMPORTED PRIVILEGES` on the database).
- If ingesting tags, the user should also have permissions to query `snowflake.account_usage.tag_references`.